Mining for Profitability - Horizen (formerly ZenCash) Thanks Early GPU Miners

Thank you for inviting Horizen to the GPU mining AMA!
ZEN had a great run of GPU mining that lasted well over a year and brought lots of value to the early Zclassic miners. It is mined using the Equihash algorithm, and ASIC miners have been available for that algorithm since about June of 2018. GPU mining is not really profitable for Horizen at this point in time.
We’ve got a lot of miners in the Horizen community, and many GPU miners also buy ASIC miners. Happy to talk about algorithm changes, security, and any other aspect of mining in the questions below. There are also links to the Horizen website, blog post, etc. below.
So, if I’m not here to ask you to mine, hold, and love ZEN, what can I offer? Notes on some of the lessons I’ve learned about maximizing mining profitability, an update on Horizen (there is life after moving on from GPU mining), and answers to your questions over the next 7 days.
_____________________________________________________________________________________________________

Mining for Profitability - Horizen (formerly ZenCash) Thanks Early GPU Miners

Author: Rolf Versluis - co-founder of Horizen

In GPU mining, just like in many of the activities involved with Bitcoin and cryptocurrencies, there is both a cycle and a progression. The Bitcoin price cycle is fairly steady, and by creating a personal handbook of actions to take during the cycle, GPU miners can maximize their profitability.
Maximizing profitability isn't the only aspect of GPU mining that is important, of course, but it is what makes it possible to invest in new hardware and to justify the time spent building and maintaining the GPU miners. If it were a constant process that also involved losing money, it wouldn't be as much fun.

Technology Progression

For a given mining algorithm, there is definitely a technology progression. We can look back on the technology that was used to mine Bitcoin and see how it first started off as Central Processing Unit (CPU) mining, then it moved to Graphical Processing Unit (GPU) mining, then Field Programmable Gate Array (FPGA), and then Application Specific Integrated Circuit (ASIC).
Throughout this evolution we have witnessed a variety of unsavory business practices that unfortunately still happen on occasion: ASIC miner manufacturers taking pre-orders 6 months in advance, GPU manufacturers creating commercial cards for large farms that are difficult for retail customers to secure, and ASIC miner manufacturers mining on gear for months before making it available for sale.
When a new cryptocurrency is created, in many cases a new mining algorithm is created as well. This is important, because if an existing algorithm were used, the coin would be open to a 51% attack from day one and might not even be able to build a valid blockchain.
Because there is such a focus on profitability, developers of GPU mining applications are usually able to write a mining application fairly rapidly, then iterate it to the limit of current GPU technology. If a new cryptocurrency looks promising, FPGA stream developers and ASIC hardware developers start working on their designs at the same time.
The people who create the hashing algorithms run by the miners are usually not very familiar with the design capabilities of hardware manufacturers. Building application-specific semiconductors is an industry that is almost 60 years old now, and FPGAs have been around for almost 35 years. This is an industry with very experienced engineers using advanced design and modeling tools.
Promising cryptocurrencies are usually ones that are deploying new technology or going after a big market, and that have at least a team of talented software developers. In the best case, the project has a full-stack business team covering development, project management, systems administration, marketing, sales, and leadership. This is the type of project that attracts early investment from the market, which will drive the price of the coin up significantly in the first year.
For any cryptocurrency that is a worthwhile investment of time, money, and electricity for the hashing, ASIC miners will be developed for it. Instead of fighting this technology progression, GPU miners may be better off recognizing it as inevitable, and taking advantage of the cryptocurrency cycle to maximize GPU mining profitability instead.

Cryptocurrency Price Cycle

For quality crypto projects, in addition to the one-way technology progression of CPU -> GPU -> FPGA -> ASIC, there is an upward price progression. More importantly, there is a cryptocurrency price cycle that oscillates around that overall upward price progression. Plotted against time, a cycle with an upward progression looks like a sine wave with an ever-increasing average value, which is what we have seen so far with the Bitcoin price.
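To make that "sine wave with a rising average" picture concrete, here is a minimal sketch in Python; the starting price, growth rate, cycle length, and amplitude are all illustrative assumptions, not fitted values.

```python
import math

def model_price(day, base=100.0, annual_growth=2.0, cycle_days=4 * 365, amplitude=0.6):
    """Toy model: an exponential upward progression with a cycle oscillating around it.

    base          -- illustrative starting price (assumption)
    annual_growth -- x-fold growth per year (assumption, not a fitted value)
    cycle_days    -- assumed length of one full market cycle
    amplitude     -- how far the cycle swings around the trend (0.6 = +/-60%)
    """
    years = day / 365.0
    trend = base * (annual_growth ** years)                # ever-increasing average value
    cycle = 1.0 + amplitude * math.sin(2 * math.pi * day / cycle_days)
    return trend * cycle

# Print the toy price once a quarter for four years.
for day in range(0, 4 * 365 + 1, 90):
    print(f"day {day:4d}: {model_price(day):10.2f}")
```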

[Figure: Cryptocurrency price cycle and progression for miners]
For the GPU miner, this means mining promising new cryptocurrencies, holding them as the price rises, and being ready to sell a significant portion in the first year. Just about every cryptocurrency is going to have a sharp price rise at some point, whether through institutional investor interest or by being the target of a pump-and-dump operation. It’s especially likely in the first year, while the supply is low and there is not much trading volume or liquidity on exchanges.
Miners need to operate in the world of government money, as well as cryptocurrency. The people who run mining businesses at some point have to start selling their mining proceeds to pay the bills, and to buy new equipment as the existing equipment becomes obsolete. Working to maximize profitability means more than just mining new cryptocurrencies, it also means learning when to sell and how to manage money.

Managing Cash for Miners

The worst thing that can happen to a business is to run out of cash. When that happens, the business usually shuts down and goes into bankruptcy. Sometimes an investor comes in and picks up the pieces, but at that point the former owners become employees.
There are two sides to managing cash - one is earning it, the other is spending it, and the cryptocurrency price cycle can tell the GPU miner when it is the best time to do certain things. A market top and bottom is easy to recognize in hindsight, and harder to see when in the middle of it. Even if a miner is able to recognize the tops and bottoms, it is difficult to act when there is so much hype and positivity at the top of the cycle, and so much gloom and doom at the bottom.
A decent rule of thumb from the last few cycles appears to be that at the top and at the bottom of each cycle, the BTC price in USD is roughly 10x what it was at the same point in the previous cycle. Newer crypto projects tend to have bigger price swings than Bitcoin, and during the rising phase of the price cycle there is the possibility that an altcoin will rise to 100x its starting price.
Taking profits from selling altcoins during the rise is important, but so is maintaining a reserve. In order to catch a 100x move, it may be worth the risk to put some of the altcoin on an exchange and set a very high limit order. For the larger cryptocurrencies like Bitcoin it is important to set trailing sell stops on the way up, and to not buy back in for at least a month if a sell stop gets triggered. Being able to read price charts, see support and resistance areas for price, and knowing how to set sell orders are an important part of mining profitability.
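As a rough illustration of the trailing-stop idea, here is a minimal sketch; the 15% trail distance is an arbitrary example, and most exchanges offer this natively as a "trailing stop" order type.

```python
def run_trailing_stop(prices, trail_pct=0.15):
    """Follow the price up; sell when it falls trail_pct below its running peak.

    prices    -- sequence of observed prices, oldest first
    trail_pct -- distance of the stop below the peak (15% is an arbitrary example)
    Returns (index, price) where the stop triggers, or None if it never does.
    """
    peak = prices[0]
    for i, p in enumerate(prices):
        peak = max(peak, p)                 # the stop level ratchets up, never down
        if p <= peak * (1 - trail_pct):
            return i, p                     # stop triggered: sell here
    return None                             # still riding the trend

# Example: price runs from 100 to 160, then rolls over; stop fires at 135.
print(run_trailing_stop([100, 120, 140, 160, 150, 135, 120]))  # -> (5, 135)
```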

Actions to Take During the Cycle

As the cycle starts to rise from the bottom, this is a good time to buy mining hardware, because it will be inexpensive. It is also the time to mine and buy altcoins, which are usually the first to see a price rise and tend to have larger price increases than Bitcoin.
On the rise of the cycle, this is a good time to see which altcoins are doing well from a project fundamentals standpoint, and which ones look like they are undergoing accumulation from investors.
Halfway through the rise of the cycle is the time to start selling altcoins for the larger project cryptos like Bitcoin. Miners will miss some of the profit at the top of the cycle, but will not run out of cash by doing this. This is also the time to stop buying mining hardware. Don’t worry, you’ll be able to pick up that same hardware used for a fraction of the price at the next bottom.
As the price nears the top of the cycle, sell enough Bitcoin and other cryptocurrencies to meet the following projected costs:
  • Mining electricity costs for the next 12 months
  • Planned investment into new miners for the next cycle
  • Additional funds needed for things like supporting a family or buying a Lambo
  • Taxes on all the capital gains from the sale of cryptocurrencies
It may be worth selling 70-90% of crypto holdings, maintaining a reserve in case there is a second upward move caused by government bankruptcies. But selling a large part of the crypto is helpful for maintaining profitability and having enough cash reserves to make it through the bottom part of the next cycle.
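Turning that checklist into a number can be as simple as the following sketch; all figures are made up, and the flat tax rate is a heavy simplification of real capital-gains rules.

```python
def btc_to_sell(elec_12mo, new_miners, living, gains, tax_rate, btc_price):
    """How much BTC to sell near the top to cover the costs listed above.

    All fiat amounts are illustrative. Tax is simplified to a flat rate on
    realized gains, which is NOT how most jurisdictions actually work.
    """
    taxes = gains * tax_rate
    total_fiat = elec_12mo + new_miners + living + taxes
    return total_fiat / btc_price

# Example with made-up numbers: $12k power, $20k hardware, $30k living costs,
# $80k realized gains taxed at a flat 25%, BTC at $15,000.
print(f"{btc_to_sell(12_000, 20_000, 30_000, 80_000, 0.25, 15_000):.2f} BTC")  # -> 5.47 BTC
```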
As the cycle has peaked and starts to decline, this is a good time to start investing in mining facilities and other infrastructure, brush up on trading skills, count your winnings, and take some vacation.
At the bottom of the cycle, it is time to start buying both used and new mining equipment. The bottom can be hard to recognize.
If you can continue to mine all the way through the bottom part of the cryptocurrency pricing cycle, paying the bills with the funds from selling near the top, you will have a profitable and enjoyable cryptocurrency mining business. Any cryptocurrency you are able to hold onto will benefit from the price progression in the next, higher phase of the cycle.

An Update on Horizen - formerly ZenCash

The team at Horizen recognizes the important part that GPU miners played in the early success of Zclassic and ZenCash, and there is always a welcoming attitude toward ZEN miners, past and present. About 1 year after ZenCash launched, ASIC miners became available for the Equihash algorithm. A chart of mining difficulty over time shows when it was time for GPU miners to move on to mining other cryptocurrencies.

[Figure: Horizen historical block difficulty]
Looking at the hashrate chart, it is straightforward to see that ASIC miners were deployed starting in June 2018. It also appears that there was a jump in mining hashrate in October of 2017. This may have been larger GPU farms switching over to mine Horizen, FPGAs on the network, or early versions of Equihash ASIC miners that were kept private.
The team understands the importance of the cryptocurrency price cycle as it affects the funds in the Horizen treasury and the investments that can be made. 20% of each block reward is sent to the Horizen non-profit foundation to fund improvements to the project. Just as miners have to manage money, the team has to decide whether to spend funds when the price is high or convert them to another form in preparation for the bottom part of the cycle.
During the rise and upper part of the last price cycle, Horizen worked hard to maximize the value of the project in many different ways, including spending on research and development, project management, marketing, business development with exchanges and merchants, and working to create adoption in all the countries of the world.
During the lower half of the cycle, Horizen has reduced the team to the essentials and worked to build a base of users and relationships with investors, exchanges, and merchants, while continuing to develop the higher-priority software projects. Lower-priority software development, trade shows, and paid business partnerships such as exchanges and applications have all been stopped.
Miners are still a very important part of the Horizen ecosystem, earning 60% of the block reward. 20% goes to node operators, with 20% to the foundation. In the summer of 2018 the consensus algorithm was modified slightly to make it much more difficult for any group of miners to perform a 51% attack on Horizen. This has so far proven effective.
The team is strong. We provide monthly updates on a YouTube live stream on the first Wednesday of each month, where all questions asked during the stream are addressed, and our marketing team works to develop awareness of Horizen worldwide. New wallet software was released recently, and it is the foundational application for people to use and manage their ZEN going forward.
Horizen is a Proof of Work cryptocurrency, and the current development team has no plan to change that. If there is a security or centralization concern there may be a change to the algorithm, but that appears unlikely at this time, as the hidden-chain mining penalty looks effective in stopping 51% attacks.
During 2019 and 2020 the Horizen team plans to release many new software updates:
  • Sidechains modification to main software
  • Sidechain Software Development Kit
  • Governance and Treasury application running on a sidechain
  • Node tracking and payments running on a sidechain
  • Conversion from blockchain to a Proof of Work BlockDAG using Equihash mining algorithm
After these updates are working well, the team will work to transition Horizen over to a governance model where major decisions and the allocation of treasury funds are done through a form of democratic voting. At this point all the software developed by Horizen is expected to be open source.
When the governance is transitioned, the project should be as decentralized as possible. The goal of decentralization is to enable resilience and to prevent capture of the project by regulators, governments, criminal organizations, large corporations, or a small group of individuals.
Everyone involved with Horizen can be proud of what we have accomplished together so far. Miners who were there for the early mining and growth of the project played a large part in securing the network, evangelizing to new community members, and helping to create liquidity on new exchanges. Miners are still a very important part of the project and community. Together we can look forward to achieving many new goals in the future.

Here are some links to find out more about Horizen.
Horizen Website – https://horizen.global
Horizen Blog – https://blog.horizen.global
Horizen Reddit - https://www.reddit.com/Horizen/
Horizen Discord – https://discord.gg/SuaMBTb
Horizen Github – https://github.com/ZencashOfficial
Horizen Forum – https://forum.horizen.global/
Horizen Twitter – https://twitter.com/horizenglobal
Horizen Telegram – https://t.me/horizencommunity
Horizen on Bitcointalk – https://bitcointalk.org/index.php?topic=2047435.0
Horizen YouTube Channel – https://www.youtube.com/c/Horizen/
Buy or Sell Horizen
Horizen on CoinMarketCap – https://coinmarketcap.com/currencies/zencash/

About the Author:

Rolf Versluis is Co-Founder and Executive Advisor of the privacy oriented cryptocurrency Horizen. He also operates multiple private cryptocurrency mining facilities with hundreds of operational systems, and has a blog and YouTube channel on crypto mining called Block Operations.
Rolf applies his engineering background as well as management and leadership experience from running a 60 person IT company in Atlanta and as a US Navy nuclear submarine officer operating out of Hawaii to help grow and improve the businesses in which he is involved.
_____________________________________________________________________________________________
Thank you again for the Ask Me Anything - please do. I'll be checking the post and answering questions actively from 28 Feb to 6 Mar 2019 - Rolf
submitted by Blockops to gpumining

An extensive list of blockchain courses, resources and articles to help you get a job working with blockchain.

u/Maximus_no and I spent some time at work collecting and analyzing learning material for blockchain development. The list contains resources for developers, as well as for business analysts and consultants looking to learn more about blockchain use cases and solutions.

Certifications and Courses

IIB Council
Link to course: IIB council : Certified Blockchain Professional
C|BP is an in-depth, industry-agnostic, hands-on training and certification course tailored for industry professionals and developers interested in implementing emerging technologies in data-driven markets and digitized economies.
The IIB Council Certified Blockchain Professional (C|BP) course was developed to help aspiring professionals gain extensive knowledge of blockchain technology and its implications for business.
WHO IS IT FOR:

Professionals

C|BP is developed in line with the latest industry trends to help current and aspiring Professionals evolve in their career by implementing the latest knowledge in blockchain technology. This course will help professionals understand the foundation of Blockchain technology and the opportunities this emerging technology is offering.

Developers

If you are a developer willing to learn blockchain technology, this course is for you. You will learn to build and model blockchain solutions and blockchain-based applications for enterprises and businesses on multiple blockchain platforms.

Certified Blockchain Business Foundations (CBBF)

This exam is designed for non-technical business professionals who require basic knowledge about Blockchain and how it will be executed within an organization. This exam is NOT appropriate for technology professionals seeking a deeper understanding of Blockchain technology implementation or programming.

A person who holds this certification demonstrates their knowledge of:

· What is Blockchain? (What exactly is it?)
· Non-Technical Technology Overview (How does it work?)
· Benefits of Blockchain (Why should anyone consider this?)
· Use Cases (Where and for what apps is it appropriate?)
· Adoption (Who is using it and for what?)
· Future of Blockchain (What is the future?)

Certified Blockchain Solution Architect (CBSA)

A person who holds this certification demonstrates their ability to:

· Architect blockchain solutions
· Work effectively with blockchain engineers and technical leaders
· Choose appropriate blockchain systems for various use cases
· Work effectively with both public and permissioned blockchain systems

This exam will prove that a student completely understands:

· The difference between proof of work, proof of stake, and other proof systems and why they exist
· Why cryptocurrency is needed on certain types of blockchains
· The difference between public, private, and permissioned blockchains
· How blocks are written to the blockchain
· Where cryptography fits into blockchain and the most commonly used systems
· Common use cases for public blockchains
· Common use cases for private & permissioned blockchains
· What is needed to launch your own blockchain
· Common problems & considerations in working with public blockchains
· Awareness of the tech behind common blockchains
· When mining is needed and when it is not
· Byzantine Fault Tolerance
· Consensus among blockchains
· What is hashing
· How addresses, public keys, and private keys work
· What is a smart contract
· Security in blockchain
· Brief history of blockchain
· The programming languages of the most common blockchains
· Common testing and deployment practices for blockchains and blockchain-based apps

Certified Blockchain Developer - Ethereum (CBDE)

A person who holds this certification demonstrates their ability to:

· Plan and prepare production ready applications for the Ethereum blockchain
· Write, test, and deploy secure Solidity smart contracts
· Understand and work with Ethereum fees
· Work within the bounds and limitations of the Ethereum blockchain
· Use the essential tooling and systems needed to work with the Ethereum ecosystem

This exam will prove that a student completely understands how to:

· Implement web3.js
· Write and compile Solidity smart contracts
· Create secure smart contracts
· Deploy smart contracts to both the live and test Ethereum networks
· Calculate Ethereum gas costs
· Unit test smart contracts
· Run an Ethereum node on development machines

Princeton: Sixty free lectures from Princeton on bitcoin and cryptocurrencies. Avg length ~15 mins

Basic course with focus on Bitcoin. After this course, you’ll know everything you need to be able to separate fact from fiction when reading claims about Bitcoin and other cryptocurrencies. You’ll have the conceptual foundations you need to engineer secure software that interacts with the Bitcoin network. And you’ll be able to integrate ideas from Bitcoin in your own projects.

MIT : BLOCKCHAIN TECHNOLOGIES: BUSINESS INNOVATION AND APPLICATION

· A mid / basic understanding of blockchain technology and its long-term implications for business, coupled with knowledge of its relationship to other emerging technologies such as AI and IoT
· An economic framework for identifying blockchain-based solutions to challenges within your own context, guided by the knowledge of cryptoeconomics expert Christian Catalini
· Recognition of your newfound blockchain knowledge in the form of a certificate of completion from the MIT Sloan School of Management — one of the world’s leading business schools
Orientation Module: Welcome to Your Online Campus
Module 1: An introduction to blockchain technology
Module 2: Bitcoin and the curse of the double-spending problem
Module 3: Costless verification: Blockchain technology and the last mile problem
Module 4: Bootstrapping network effects through blockchain technology and cryptoeconomics
Module 5: Using tokens to design new types of digital platforms
Module 6: The future of blockchain technology, AI, and digital privacy

Oxford Blockchain Strategy Programme

· A mid / basic understanding of what blockchain is and how it works, as well as insights into how it will affect the future of industry and of your organization.
· The ability to make better strategic business decisions by utilizing the Oxford Blockchain Strategic framework, the Oxford Blockchain Regulation framework, the Oxford Blockchain Ecosystem map, and drawing on your knowledge of blockchain and affiliated industries and technologies.
· A certificate of attendance from Oxford Saïd as validation of your newfound blockchain knowledge and skills, as well as access to a global network of like-minded business leaders and innovators.
Module 1: Understanding blockchain
Module 2: The blockchain ecosystem
Module 3: Innovations in value transfer
Module 4: Decentralized apps and smart contracts
Module 5: Transforming enterprise business models
Module 6: Blockchain frontiers

Resources and Articles

Introduction to Distributed Ledger Technologies (DLT) https://www.ibm.com/developerworks/cloud/library/cl-blockchain-basics-intro-bluemix-trs/
Tomas’s Personal Favourite: 150+ Resources for going from web-dev to blockchain engineer https://github.com/benstew/blockchain-for-software-engineers
Hyperledger Frameworks Hyperledger is widely regarded as the most mature open-source framework for building private & permissioned blockchains.
Tutorials: https://www.hyperledger.org/resources/training
R3 Corda Open-source developer frameworks for building private, permissioned blockchains. A little better than Hyperledger on features like privacy and secure channels. Used mostly in financial applications.
Ethereum, Solidity, dApps and Smart-Contracts
Ethereum & Solidity Course (favourite): https://www.udemy.com/ethereum-and-solidity-the-complete-developers-guide/
An Introduction to Ethereum’s Token Standards: https://medium.com/coinmonks/anatomy-of-an-erc-an-exhaustive-survey-8bc1a323b541
How To Create Your First ERC20 Token: https://medium.com/bitfwd/how-to-do-an-ico-on-ethereum-in-less-than-20-minutes-a0062219374
Ethereum Developer Tools [Comprehensive List]: https://github.com/ConsenSys/ethereum-developer-tools-list/blob/master/README.md
CryptoZombies – Learn to code dApps through game-development: https://cryptozombies.io/
Intro to Ethereum Development: https://hackernoon.com/ethereum-development-walkthrough-part-1-smart-contracts-b3979e6e573e
Notes from Consensys Academy Participant (free): https://github.com/ScottWorks/ConsenSys-Academy-Notes
AWS Ethereum Templates: https://aws.amazon.com/blogs/aws/get-started-with-blockchain-using-the-new-aws-blockchain-templates/
Create dApps with better user-experience: https://blog.hellobloom.io/how-to-make-a-user-friendly-ethereum-dapp-5a7e5ea6df22
Solidity YouTube Course: https://www.youtube.com/channel/UCaWes1eWQ9TbzA695gl_PtA
[UX &UI] Designing a decentralized profile dApp: https://uxdesign.cc/designing-a-decentralized-profile-dapp-ab12ead4ab56
Scaling Solutions on Ethereum: https://media.consensys.net/the-state-of-scaling-ethereum-b4d095dbafae
Different Platforms for dApps and Smart-Contracts
While Ethereum is the most mature dApp framework with both the best developer tools, resources and community, there are other public blockchain platforms. Third generation blockchains are trying to solve Ethereum’s scaling and performance issues. Here is an overview of dApp platforms that can be worth looking into:
NEO - https://neo.org/ The second most mature dApp platform. NEO has better scalability and performance than Ethereum, with roughly 1,000 TPS to Ethereum's 15, by utilizing a dBFT consensus algorithm. Despite the better infrastructure, NEO does not have the maturity of Ethereum's developer tools, documentation, and community.
A writeup on why a company chose to develop on NEO and not Ethereum: https://medium.com/orbismesh/why-we-chose-neo-over-ethereum-37fc9208ffa0
Cardano - https://www.cardano.org/en/home/ While still in alpha with a long and ambitious roadmap ahead of it, Cardano is one of the most anticipated dApp platforms out there. IOHK, the research and engineering company that maintains Cardano, has published a lot of great resources and scientific papers that are worth looking into.
An Intro to Cardano: https://hackernoon.com/cardano-ethereum-and-neo-killer-or-overhyped-and-overpriced-8fcd5f8abcdf
IOHK Scientific Papers - https://iohk.io/research/papers/
Stellar - https://www.stellar.org/ If moving value fast from one party to another by using smart contracts is the goal, Stellar Lumens is your platform. Initially an open-source fork of Ripple, Stellar has become one of the more mature frameworks for financial applications. Stellar's focus lies in interoperability with legacy financial systems and cheap, fast value transfer. Its smart-contract capability is rather limited in comparison to Ethereum and Hyperledger, so take that into consideration.
Ripple - https://www.ripple.com Ripple and its close cousin Stellar are two of the most well-known cryptocurrencies and DLT frameworks aimed at the financial sector. Ripple enables instant settlement between banks for international transactions.

Consensus Algorithms

[Proof of Work] - covered only briefly, since it's well-known.
[1] Bitcoin - to generate a new block, a miner must produce a hash of the new block header that meets the given difficulty requirement.
Others: Ethereum, Litecoin etc.
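A minimal Python sketch of that hash-below-target idea (simplified: real Bitcoin hashes an 80-byte header with double SHA-256 and encodes the target in a compact "bits" field):

```python
import hashlib

def mine(header: bytes, difficulty_bits: int, max_nonce: int = 10_000_000):
    """Find a nonce so that SHA-256d(header || nonce) falls below the target.

    difficulty_bits leading zero bits are required; a smaller target is harder.
    The header and encoding here are toy stand-ins, not Bitcoin's real format.
    """
    target = 1 << (256 - difficulty_bits)
    for nonce in range(max_nonce):
        data = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()  # double SHA-256
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
    return None  # no solution within max_nonce; raise difficulty patience

print(mine(b"toy block header", difficulty_bits=16))  # ~65k tries on average
```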
[Hybrid of PoW and PoS]
[2] Decred - hybrid of “proof of work” and “proof of stake”. Blocks are created about every 5 minutes. Nodes in the network look for a solution with a known difficulty to create a block (PoW). Once the solution is found, it is broadcast to the network. The network then verifies the solution. Stakeholders who have locked some DCR in return for a ticket* now have the chance to vote on the block (PoS). 5 tickets are chosen pseudo-randomly from the ticket pool, and if at least 3 of the 5 vote ‘yes’, the block is permanently added to the blockchain. Both miners and voters are compensated with DCR: PoS gets 30% and PoW gets 60% of the roughly 30 new DCR issued with each block. *1 ticket = the ability to cast 1 vote. Stakeholders must wait an average of 28 days (8,192 blocks) before their tickets vote.
[Proof of Stake]
[3] Nxt - The more tokens an account holds, the greater the chance that account will earn the right to generate a block. The total reward received as a result of block generation is the sum of the transaction fees located within the block. Three values are key to determining which account is eligible to generate a block, which account earns the right to generate a block, and which block is taken to be the authoritative one in times of conflict: base target value, target value, and cumulative difficulty. Each block on the chain has a generation signature parameter. To participate in the block forging process, an active account digitally signs the generation signature of the previous block with its own public key. This creates a 64-byte signature, which is then hashed using SHA-256. The first 8 bytes of the resulting hash are converted to a number, referred to as the account hit. The hit is compared to the current target value (which scales with the active balance). If the computed hit is lower than the target, the next block can be generated.
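Here is a rough sketch of that hit/target comparison; the signing step is replaced by a plain hash, and the byte order and base-target constant are illustrative rather than consensus-exact.

```python
import hashlib

def forging_hit(prev_generation_signature: bytes, public_key: bytes) -> int:
    """Nxt-style 'hit': hash the previous generation signature together with the
    account key (the real protocol signs first; that step is omitted here),
    then read the first 8 bytes of the digest as a number."""
    digest = hashlib.sha256(prev_generation_signature + public_key).digest()
    return int.from_bytes(digest[:8], "big")

def may_forge(hit: int, base_target: int, effective_balance: int, seconds: int) -> bool:
    """The target grows with stake and with elapsed time, so accounts with
    larger balances qualify to forge sooner. Constants are illustrative."""
    target = base_target * effective_balance * seconds
    return hit < target

hit = forging_hit(b"prev-generation-signature", b"account-public-key")
print(may_forge(hit, base_target=153_722_867, effective_balance=1_000_000_000, seconds=30))
```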
[4] Peercoin (chain-based proof of stake) - coin age parameter. Hybrid PoW and PoS algorithm. The longer your Peercoins have been stationary in your account (to a maximum of 90 days), the more power (coin age) they have to mint a block. The act of minting a block requires the consumption of coin age value, and the network determines consensus by selecting the chain with the largest total consumed coin age. Reward - minting + 1% yearly.
[5] Reddcoin (Proof of Stake Velocity) - quite similar to Peercoin; the difference is a non-linear coin-aging function (new coins gain weight quickly, old coins gain weight increasingly slowly) to encourage node activity. Nodes with the most coin-age weight have a bigger chance to create a block. To create a block, a node must calculate the right hash. Block reward - interest on the weighted age of coins: 5% annual interest in the PoSV phase.
[6] Ethereum (Casper) - uses modified BFT consensus. Blocks will be created using PoW. In the Casper Phase 1 implementation for Ethereum, the “proposal mechanism" is the existing proof of work chain, modified to have a greatly reduced block reward. Blocks will be validated by set of Validators. Block is finalised when 2/3 of validators voted for it (not the number of validators is counted, but their deposit size). Block creator rewarded with Block Reward + Transaction FEES.
[7] Lisk (Delegated Proof-of-stake) - Lisk stakeholders vote with vote transaction (the weight of the vote depends on the amount of Lisk the stakeholder possess) and choose 101 Delegates, who create all blocks in the blockchain. One delegate creates 1 block within 1 round (1 round contains 101 blocks) -> At the beginning of each round, each delegate is assigned a slot indicating their position in the block generation process -> Delegate includes up to 25 transactions into the block, signs it and broadcasts it to the network -> As >51% of available peers agreed that this block is acceptable to be created (Broadhash consensus), a new block is added to the blockchain. *Any account may become a delegate, but only accounts with the required stake (no info how much) are allowed to generate blocks. Block reward - minted Lisks and transaction fees (fees for all 101 blocks are collected firstly and then are divided between delegates). Blocks appears every 10 sec.
[8] Cardano (Ouroboros Proof of Stake) - Blocks (slots) are created by slot leaders. Slot leaders for epoch N are chosen during epoch N-1. Slot leaders are elected from the group of ADA stakeholders who have enough stake. The election process consists of 3 phases: Commitment phase: each elector generates a random value (secret), signs it, and commits it as a message to the network (the other electors), saved into a block. -> Reveal phase: each elector sends a special value to open its commitment; all these values (openings) are put into the block. -> Recovery phase: each elector verifies that commitments and openings match, extracts the secrets, and forms a SEED (a randomly generated byte string based on the secrets). All electors get the same SEED. -> Follow the Satoshi algorithm: the elector who holds the coin which corresponds to the SEED becomes the SLOT LEADER and gets the right to create a block. The slot leader is rewarded with minted ADA and transaction fees.
[9] Tezos (Proof Of Stake) - generic and self-amending crypto-ledger. At the beginning of each cycle (2048 blocks), a random seed is derived from numbers that block miners chose and committed to in the penultimate cycle, and revealed in the last. -> Using this random seed, a follow the coin strategy (similar to Follow The Satoshi) is used to allocate mining rights and signing rights to stakeholders for the next cycle*. -> Blocks are mined by a random stakeholder (the miner) and includes multiple signatures of the previous block provided by random stakeholders (the signers). Mining and signing both offer a small reward but also require making a one cycle safety deposit to be forfeited in the event of a double mining or double signing.
· The more coins (rolls) you have, the better your chance of being selected as a miner or signer.
[10] Tendermint (Byzantine Fault Tolerance) - A proposal is signed and published by the designated proposer at each round. The proposer is chosen by a deterministic, non-choking round-robin selection algorithm that selects proposers in proportion to their voting power. The proposer creates the block, which must be validated by >2/3 of validators, as follows: Propose -> Prevote -> Precommit -> Commit. The proposer is rewarded with transaction fees.
[11] Tron (Byzantine Fault Tolerance) - This blockchain is still in the development stage. Consensus algorithm = PoS + BFT (similar to Tendermint): the PoS algorithm chooses a node as proposer, and this node has the power to generate a block. -> The proposer broadcasts the block it wants to release. -> The block enters the Prevote stage; it takes >2/3 of nodes' confirmations to enter the next stage. -> Once the block is prevoted, it enters the Precommit stage and needs >2/3 of nodes' confirmations to go further. -> Once >2/3 of nodes have precommitted, the block is committed to the blockchain at height +1. New blocks appear every 15 sec.
[12] NEO (Delegated Byzantine Fault Tolerance) - Consensus nodes* are elected by NEO holders -> The Speaker is identified (based on algorithm) -> He broadcasts proposal to create block -> Each Delegate (other consensus nodes) validates proposal -> Each Delegate sends response to other Delegates -> Delegate reaches consensus after receiving 2/3 positive responses -> Each Delegate signs the block and publishes it-> Each Delegate receives a full block. Block reward 6 GAS distributed proportionally in accordance with the NEO holding ratio among NEO holders. Speaker rewarded with transaction fees (mostly 0). * Stake 1000 GAS to nominate yourself for Bookkeeping(Consensus Node)
[13] EOS (Delegated Proof of Stake) - those who hold tokens on a blockchain adopting the EOS.IO software may select* block producers through a continuous approval voting system and anyone may choose to participate in block production and will be given an opportunity to produce blocks proportional to the total votes they have received relative to all other producers. At the start of each round 21 unique block producers are chosen. The top 20 by total approval are automatically chosen every round and the last producer is chosen proportional to their number of votes relative to other producers. Block should be confirmed by 2/3 or more of elected Block producers. Block Producer rewarded with Block rewards. *the more EOS tokens a stakeholder owns, the greater their voting power
[The XRP Ledger Consensus Process]
[14] Ripple - Each node receives transactions from external applications -> Each node forms a public list of all valid transactions not yet included in the last ledger (=block), known as its Candidate Set -> Nodes merge their candidate sets with those of their UNL (Unique Node List) and vote on the veracity of all transactions (1st round of consensus) -> All transactions that receive at least 50% of votes are passed on to the next round (many rounds may take place) -> The final round of consensus requires that a minimum of 80% of a node's UNL agree on the transactions, meaning at least 80% of validating nodes must have the same candidate set of transactions -> After that, each validating node computes a new ledger (=block) with all transactions that have 80% UNL agreement, calculates the ledger hash, signs it, and broadcasts it -> All validating nodes compare their ledger hashes -> Nodes of the network recognize a ledger instance as validated when 80% of the peers have signed and broadcast the same validation hash -> The process repeats. The ledger creation process lasts 5 sec(?). Each transaction includes a transaction fee (min 0.00001 XRP), which is destroyed. No block rewards.
[The Stellar consensus protocol]
[15] Stellar (Federated Byzantine Agreement) - quite similar to Ripple. Key difference - quorum slice.
[Proof of Burn]
[16] Slimcoin - to get the right to write blocks Node should “burn” amount of coins. The more coins Node “burns” more chances it has to create blocks (for long period) -> Nodes address gets a score called Effective Burnt Coins that determines chance to find blocks. Block creator rewarded with block rewards.
[Proof of Importance]
[17] NEM - Only accounts that have at least 10k vested coins are eligible to harvest (create a block). Accounts with higher Importance scores have higher probabilities of harvesting a block. The larger the amount of vested coins, the higher the account's Importance score; the score also rises with the number of transactions that satisfy the following conditions: transactions of at least 1k coins, made within the last 30 days, to recipients who also have 10k vested coins. The harvester is rewarded with the fees for the transactions in the block. A new block is created approx. every 65 sec.
[Proof of Devotion]
[18] Nebulas (Proof of Devotion + BFT) - quite similar to POI, the PoD selects the accounts with high influence. All accounts are ranked according to their liquidity and propagation (Nebulas Rank) -> Top-ranked accounts are selected -> Chosen accounts pay deposit and are qualified as the blocks Validators* -> Algorithm pseudo-randomly chooses block Proposer -> After a new block is proposed, Validators Set (each Validator is charged a deposit) participate in a round of BFT-Style voting to verify block (1. Prepare stage -> 2. Commit Stage. Validators should have > 2/3 of total deposits to validate Block) -> Block is added. Block rewards : each Validator rewarded with 1 NAS. *Validators Set is dynamic, changes in Set may occur after Epoch change.
[IOTA Algorithm]
[19] IOTA - uses a DAG (Directed Acyclic Graph) instead of a blockchain (the TANGLE is the equivalent of the ledger). The graph consists of transactions (not blocks). To issue a new transaction, a node must approve 2 random other (unconfirmed) transactions. Each transaction should be validated n(?) times. By validating 2 PAST transactions, the whole network achieves consensus. In order to issue a transaction, a node must: 1. sign the transaction with its private key, 2. choose two other transactions to validate based on the MCMC (Markov chain Monte Carlo) algorithm and check that the 2 transactions are valid (a node will never approve conflicting transactions), 3. do some PoW (similar to HashCash). -> The new transaction is broadcast to the network. Nodes receive no reward or fee.
[PBFT + PoW]
[20] Yobicash - uses PBFT and also PoW. Nodes reach consensus on transactions by querying other nodes. A node asks its peers about the state of a transaction: whether it is known or not, and whether it is a doublespending transaction or not. As follows: a node receives a new transaction -> checks if it is valid -> queries all known nodes for missing transactions (to check whether it is already in the DAG) -> queries 2/3 of nodes about doublespending -> if everything is ok, adds it to the DAG. Reward - nodes receive transaction fees + minted coins.
[Proof of Space/Proof of Capacity]
[21] Filecoin (Power Fault Tolerance) - the probability that the network elects a miner(Leader) to create a new block (it is referred to as the voting power of the miner) is proportional to storage currently in use in relation to the rest of the network. Each node has Power - storage in use verified with Proof of Spacetime by nodes. Leaders extend the chain by creating a block and propagating it to the network. There can be an empty block (when no leader). A block is committed if the majority of the participants add their weight on the chain where the block belongs to, by extending the chain or by signing blocks. Block creator rewarded with Block reward + transaction fees.
[Proof of Elapsed Time (POET)]
[22] Hyperledger Sawtooth - The goal is to solve the limitation on the number of BFT validating nodes. Works only with Intel's SGX. PoET uses a random, lottery-based leader election model based on SGX, where the protocol randomly selects the next leader to finalize the block. Every validator requests a wait time from an enclave (a trusted function). -> The validator with the shortest wait time for a particular transaction block is elected the leader. -> The BlockPublisher is responsible for creating candidate blocks to extend the current chain. It takes direction from the consensus algorithm on when to create a block and when to publish a block. It creates, finalizes, and signs the block and broadcasts it -> Block validators check the block -> The block is added on top of the blockchain.
[23] Byteball (Delegated Byzantine Fault Tolerance) - only verified nodes are allowed to be Validation nodes (list of requirements https://github.com/byteball/byteball-witness). Users choose in transaction set of 12 Validating nodes. Validating nodes(Witnesses) receive transaction fees.
[24] Nano - uses DAG, PoW (HashCash). Nano uses a block-lattice structure. Each account has its own blockchain (account-chain) equivalent to the account’s transaction/balance history. To add transaction user should make some HashCash PoW -> When user creates transaction Send Block appears on his blockchain and Receive block appears on Recipients blockchain. -> Peers in View receive Block -> Peers verify block (Double spending and check if already in the ledger) -> Peers achieve consensus and add block. In case of Fork (when 2 or more signed blocks reference the same previous block): Nano network resolves forks via a balance-weighted voting system where representative nodes vote for the block they observe, as >50% of weighted votes received, consensus achieved and block is retained in the Node’s ledger (block that lose the vote is discarded).
[25] Holochain - uses distributed hash table (DHT). Instead of trying to manage global consensus for every change to a huge blockchain ledger, every participant has their own signed hash chain. In case of multi-party transaction, it is signed to each party's chain. Each party signs the exact same transaction with links to each of their previous chain entries. After data is signed to local chains, it is shared to a DHT where every neighbor node validate it. Any consensus algorithms can be built on top of Holochain.
[26] Komodo ('Delegated' Delayed Proof of Work (dPoW)) - end-to-end blockchain solutions. The dPoW consensus mechanism does not recognize the Longest Chain Rule to resolve a conflict in the network; instead, dPoW looks to the backups it previously inserted into the chosen PoW blockchain. The process of inserting backups of Komodo transactions into a secure PoW chain is called "notarization." Notarization is performed by the elected notary nodes. Roughly every ten minutes, the notary nodes perform a special block hash mined on the Komodo blockchain and take note of the overall Komodo blockchain "height". The notary nodes process this specific block so that their signatures are cryptographically included within the content of the notarized data. There are sixty-four "notary nodes", elected by a stake-weighted vote where ownership of KMD represents stake in the election. They are a special type of blockchain miner, having certain features in their underlying code that enable them to maintain an effective and cost-efficient blockchain, and they periodically receive the privilege to mine a block on "easy difficulty."
Source: https://www.reddit.com/CryptoTechnology/comments/7znnq8/my_brief_observation_of_most_common_consensus/
Whitepapers Worth Looking Into:
IOTA - http://iotatoken.com/IOTA_Whitepaper.pdf
NANO - https://nano.org/en/whitepaper
Bitcoin - https://bitcoin.org/bitcoin.pdf
Ethereum - https://github.com/ethereum/wiki/wiki/White-Paper
Ethereum Plasma (Omise-GO) - https://plasma.io/plasma.pdf
Cardano - https://eprint.iacr.org/2016/889.pdf
submitted by heart_mind_body to CryptoCurrency

We may have to accept higher fees until September 10th

The cause is the crazy difficulty swing in bcash, which affects bitcoin to some degree. You can see this here. The swing frequency is roughly once every 3 days.
What happens is that the stupidly designed difficulty adjustment algorithm in bcash, which, by the way, deviates from the design of Satoshi Nakamoto's white paper and thus from the bitcoin consensus, causes violent swings in difficulty and hash rate, because miners jump over to mine bcash when its difficulty is very low, mining 30 to 50 blocks per hour rather than the originally envisioned 6.
Conversely, during the high-difficulty phase almost all miners leave bcash, which has already led to block rates of fewer than one block every two hours, almost a standstill. Here is an illustrative graph.
While the miners are not mining bitcoin, the bitcoin block rate goes down to around 4 blocks per hour. During the time when miners jump back on bitcoin, it averages around 8 blocks per hour. Because the low-block-rate phase is longer, at least two days, the total average block rate is below 6, so bitcoin accumulates a backlog that leads to higher fees.
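A quick back-of-the-envelope check of that claim, using the rough figures from the paragraph above:

```python
# Rough figures from the text: ~4 blocks/h while miners are away (at least 2 days),
# ~8 blocks/h while they are back (~1 day), versus the intended 6 blocks/h.
low_rate, low_hours = 4, 2 * 24
high_rate, high_hours = 8, 1 * 24

avg_rate = (low_rate * low_hours + high_rate * high_hours) / (low_hours + high_hours)
backlog = (6 - avg_rate) * (low_hours + high_hours)  # blocks "missing" vs. schedule

print(f"average: {avg_rate:.2f} blocks/h -> ~{backlog:.0f} blocks behind per 3-day swing")
# -> average: 5.33 blocks/h -> ~48 blocks behind per 3-day swing
```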
This is unpleasant, but it is also a mitigating effect, because it makes mining bitcoin more attractive, so fewer miners desert. Another mitigating effect is that very-low-value transactions become too expensive, so the total volume of transactions decreases.
Yet another positive effect is that the Segregated Witness upgrade will soon show its first effects, increasing the effective block size, but this will still take a little time.
What could happen is that some users see the craziness and the utterly stupid design of bcash and sell those coins before everybody else also notices and sells. A lower bcash price will make mining bcash less attractive and will thus also alleviate the problem.
submitted by hgmichna to Bitcoin

The Great NiceHash Profit Explanation - for Sellers (the guys with the GPUs & CPUs)

Let's make a couple of things crystal clear about what you are not doing here:
But hey, I'm running MINING software!
What the hell am I doing then?!?
Who makes Profit, and how?
How is it possible everyone is making a profit?
Why do profits skyrocket, and will it last (and will this happen again)?
But my profits are decreasing all the time >:[
But why?!? I’m supposed to make lotsa money out of this!!!
But WHY!!!
  1. Interest hype -> Influx of Fiat money -> Coins quotes skyrocket -> Influx of miners -> Difficulty skyrockets -> Most of the price uptrend is choked within weeks, since it’s now harder to mine new blocks.
  2. Interest hype drains out -> Fiat money influx declines -> Coins quotes halt or even fall -> Miners still hold on to their dream -> Difficulty stays up high, even rises -> Earnings decrease, maybe even sharply, as it's still harder to mine new blocks, that may be even paid less.
So, how to judge what’s going on with my profits?
Simple breakdown of the relationship of BTC payouts by NiceHash, BTC/ALT Coins rates, and Fiat value:
| BTC quote | ALTs quotes | BTC payout | Fiat value |
|-----------|-------------|------------|------------|
| UP        | UP          | stable*)   | UP         |
| stable    | UP          | UP         | UP         |
| UP        | stable      | DOWN       | stable*)   |
| stable    | stable      | stable     | stable     |
| DOWN      | stable      | UP         | stable*)   |
| stable    | DOWN        | DOWN       | DOWN       |
| DOWN      | DOWN        | stable*)   | DOWN       |
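The table boils down to two conversions, sketched below with made-up numbers: Sellers are paid in BTC out of what buyers earn mining ALTs, and the fiat value of that payout depends on the BTC quote.

```python
def btc_payout(alt_reward, alt_btc_rate):
    """What buyers can afford to pay Sellers: ALT mining proceeds converted to BTC.
    Rises when ALTs rise against BTC, falls when BTC outpaces the ALTs."""
    return alt_reward * alt_btc_rate

def fiat_value(payout_btc, btc_fiat_rate):
    """What the Seller's BTC payout is worth in government money."""
    return payout_btc * btc_fiat_rate

# Illustrative numbers: 50 ALT mined, ALT trading at 0.001 BTC, BTC at $10,000.
p = btc_payout(50, 0.001)
print(p, fiat_value(p, 10_000))  # -> 0.05 BTC, $500
```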
Some rather obvious remarks:
More help:
Disclaimer: I'm a user - Seller like you - not in any way associated with NiceHash; this is my personal view & conclusion about some more or less obvious basics in Crypto mining and particularly using NiceHash.
Comments & critiques welcome...
submitted by t_3 to NiceHash

INT - Comparison with Other IoT Projects

What defines a good IoT project? Defining this will help us understand some of the problems these projects might struggle with and which ones excel in those areas. IoT will be a huge industry in the coming years. The true Internet 3.0 will be one of seamless data and value transfer. There will be a tremendous number of devices connected to this network, from your light bulbs to your refrigerator to your car, all autonomously transacting together in an ever-growing network, creating an intelligent, seamless world of satisfied wants and needs.
.
Let’s use the vastness of that future network as our basis for judging what makes a good project.
.
Scalability
In that future we will need very high scalability to accommodate the exponential growth in transaction volume that will occur. The network doesn’t need the ability to handle a high transaction rate from the beginning, just a robust plan to grow that ability as the network develops. We’ve already seen this issue with Bitcoin at an admittedly small market penetration. If scaling isn’t one of the more prominent parts of your framework, that is a glaring hole.
.
Applicability
Second to scalability is applicability. One size does not fit all in this space. Some uses will need real-time streaming of data, where fast and cheap transactions are key, and others will need heavier transactions full of data to be analyzed by the network for predictive uses. Some uses will need smart contracts so that devices can execute actions autonomously, and others will need the ability to encrypt data and to transact anonymously to protect the privacy of the users in this future of hyper-connectivity. We cannot possibly predict all of the future needs of this network, so ease of adaptability in a network of high applicability is a must.
.
Interoperability
In order for this network to have the high level of applicability mentioned, it would need access to real-world data outside of its own network to work from, or even to transact with. This interoperability can come in several forms. I am not a maximalist who thinks there will be one clear winner in any space. It is easy, therefore, to imagine that we would want to interact with other networks for payment/settlement or data gathering. Maybe autonomously paying bills with Bitcoin or Monero, maybe smart contracts that need to be fed additional data from the Internet, or maybe even sending an automatic invite to a wine tasting for the wine shipment that’s been RFID’d and tracked through WTC. In any case, in order to afford the highest applicability, the network will need the ability to interact with outside networks.
.
Consensus
How the network gains consensus is often overlooked in the discussion of network suitability. If the network is to support a myriad of application and transaction types, the consensus mechanism must be able to handle them without choking the network or restricting transaction types. PoW can become a bottleneck: the competition for the block reward requires an increase in block-generation difficulty, so time must be allowed for this computation between blocks, often leading to less-than-optimal block times for fast transactions. This can create a transaction backlog, as we have seen before. PoS solves some of these issues but is not immune to them either. A novel approach to gaining consensus will be needed to handle the variety and volume to come.
.
Developability
All of this can be combined to create a network that is well equipped to take on the IoT ecosystem. But penetration into the market will be held back by the difficulty manufacturers and their devices face in connecting to and interacting with the network. Having to learn a new programming language to write a smart contract or create a node, or strict requirements on the hardware capability of devices, are all barriers that make it harder and more expensive for companies to work with the network. Ultimately, no matter how perfect or feature-packed your network is, a manufacturer will more likely develop devices for the networks that are easy to work with.
.
In short, what the network needs to focus on is:
-Scalability – How does it globally scale?
-Applicability – Does it have data transfer ability, fast, cheap transactions, smart contracts, privacy?
-Interoperability – Can it communicate with the outside world, other blockchains?
-Consensus – Will it gain consensus in a way that supports scalability and applicability?
-Developability – Will it be easy for manufactures to develop devices and interact with the network?
.
.
The idea of using blockchain technology as the basis of the IoT ecosystem is not new. There are several projects out there now aiming to tackle the problem. Below you will see a high-level breakdown of those projects, with some pros and cons based on how I interpret the best solution. You will also see some supply chain projects listed. Supply chain solutions are just small niches in the larger IoT ecosystem. Item birth records, manufacturing history, and package tracking can all be “Things” which the Internet of Things tracks. In fact, INT has already leaked some information hinting that they are cooperating with pharmaceutical companies to track the manufacture and packaging of the drugs they produce. INT may someday include WTC or VEN as one of its subchains feeding information into the ecosystem.
.
.
IOTA
IOTA is a feeless and blockchain-less network called a directed acyclic graph. In my opinion, this creates more issues than it fixes.
The key to keeping IOTA feeless is that there are no miners to pay, because the work associated with verifying a transaction is distributed among all users, with each user verifying two separate transactions for each of their own. This creates problems both in enabling smart contracts and in creating user privacy. Most privacy methods (zk-SNARKs in particular) require the verifier to use computationally intensive cryptography that is beyond the capability of most devices on the IoT network (a weather sensor isn’t going to be able to build the zero-knowledge proof of a transaction every second or two). In a network where the device does the verifying of a transaction, cryptographic privacy becomes impractical. And even if there were a few systems capable of processing those transactions, there is no reward for doing the extra work. Fees keep the network safe by incentivizing honesty in the nodes, by paying those who have to work harder to verify a certain transaction, and by making it expensive to attack the network or disrupt privacy (Sybil attacks).
IOTA also doesn’t have and may never have the ability to enable smart contracts. By the very nature of the Tangle (a chain of transactions with only partial structure unlike a linear and organized blockchain), establishing the correct time order of transactions is difficult, and in some situations, impossible. Even if the transactions have been time stamped, there is no way to verify them and are therefore open to spoofing. Knowing transaction order is absolutely vital to executing step based smart contracts.
There does exist a subset of smart contracts that do not require a strong time order of transactions in order to operate properly, but accepting this just limits the use cases of the network. In any case, smart contracts will not be able to operate directly on chain in IOTA. There would need to be a trusted off-chain Oracle that watches transactions, establishes timelines, and runs the smart contract network.
.
-Scalability – High
-Applicability – Low, no smart contracts, no privacy, not able to run on lightweight devices
-Interoperability – Maybe, Oracle possibility
-Consensus – Low, DAG won’t support simple IoT devices and I don’t see all devices confirming other transactions as a reality
-Developability – To be seen, currently working with many manufacturers
.
.
Ethereum
Ethereum is the granddaddy of smart-contract blockchains. It is, arguably, in the best position to be the center point of the IoT ecosystem. Adoption is wide-ranging, it is fast and cheap to transact with, and it is well known; it is a Turing-complete decentralized virtual computer that can do anything if you have enough gas and memory. But some of the things that make it the most advanced will hold it back from being the best choice.
Turing completeness means that the programming language is complete (it can describe any problem) and can solve any problem, given enough gas to pay for it and enough memory to run the code. You could therefore create an infinite variety of different smart contracts. This infinite variability makes it impossible to create zk-SNARK verifiers efficiently enough to not cost more gas than is currently available in the block. Implementing zk-SNARKs in Ethereum would therefore require significant changes to the smart contract structure, allowing only a small subset of contracts to permit zk-SNARK transactions. That would mean a wholesale change to the Ethereum Virtual Machine. Even in Zcash, where zk-SNARKs are successfully implemented for a single, simple transaction type, they had to encode some of the network's consensus rules into the zk-SNARKs to limit the possible outcomes of the proof (like changing the question from "where are you in the US?" to "where are you in the US along these given highways?") in order to limit the computation time required to construct the proof.
Previously I wrote about how INT is using the Double Chain Consensus algorithm to allow easy scaling and to segregate network traffic and blockchain size by breaking the network down into separate cells, each with their own nodes and blockchains. This builds on lessons learned from single-chain blockchains like Bitcoin. Ethereum, which is also a single-chain blockchain, suffers from the same congestion issues, as we saw during the recent CryptoKitties craze: although the impact was far smaller than what Bitcoin has seen, transaction times grew, as did the associated fees. Ethereum has proposed a solution to the scaling issue: sharding. Sharding draws from the traditional scaling technique called database sharding, which splits up pieces of a database and stores them on separate servers, with each server pointing to the others. The goal is to have distinct nodes that store and verify a small set of transactions and then tie them up to a larger chain where all the other nodes communicate. If a node needs to know about a transaction on another chain, it finds another node with that information. What does this sound like? It is about as close to a description of the Double Chain architecture as anything INT themselves provide in their whitepaper.
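To make the sharding analogy concrete, here is a minimal Python sketch of database-style shard assignment, the technique Ethereum's proposal draws from. All names and values here (NUM_SHARDS, shard_for) are hypothetical illustrations, not Ethereum's or INT's actual design:

```python
# Illustrative database-style sharding: accounts are assigned to shards by
# hashing the address, and each shard stores/verifies only its own slice.
import hashlib

NUM_SHARDS = 4  # assumed value, for illustration only

def shard_for(address: str) -> int:
    """Deterministically map an account address to a shard."""
    digest = hashlib.sha256(address.encode()).digest()
    return digest[0] % NUM_SHARDS

# Each shard node keeps only the transactions belonging to its own shard.
shards = {i: [] for i in range(NUM_SHARDS)}
for sender, receiver, amount in [("alice", "bob", 5), ("carol", "dave", 2)]:
    shards[shard_for(sender)].append((sender, receiver, amount))

# A node needing a transaction from another shard asks a node of that shard,
# analogous to sharded database servers pointing at each other.
print({i: len(txs) for i, txs in shards.items()})
```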
.
-Scalability – Neutral, has current struggles but there are some proposals to fix this
-Applicability – Medium, has endless smart contract possibilities, no privacy currently with some proposals to fix this
-Interoperability – Maybe, Oracle possibility
-Consensus – Medium, PoW currently with proposals to change to better scaling and future proofing.
-Developability – To be seen
.
.
IoTeX
A young project, made up of several accredited academics in cryptography, machine learning, and data security. This is one of the most technically supported whitepapers I have read. They set out to solve scalability with the relay/subchain architecture proposed by Polkadot and used by INT. This architecture lends itself well to scaling and adaptability, as there is no end to the number of subchains you can add to the network, given node and consensus bandwidth.
The way they look to address privacy is interesting. On the main parent (or relay) chain, they plan on implementing some of the technology from Monero, namely ring signatures, bulletproofs, and stealth addresses. While these are proven and respected technologies, this presents some worries, as these techniques are known not to be lightweight, and it takes away from the inherent generality of the core of the network. I believe the core should be as general and lightweight as possible to allow for scaling, ease of update, and adaptability. With this functionality added, all data and transactions are made private and untraceable, and are therefore put through heavier computation. There are applications where this is not optimal: a data stream may need to be read by many devices, and encrypting it means every use requires a decryption. A plain, public, traceable network would allow this simple use. This specificity should be made at the subchain level.
Subchains will have the ability to define their needs in terms of block times, smart contracting needs, etc. This lends to high applicability.
They address interoperability directly by laying out the framework for pegging (transaction on one chain causing a transaction on another), and cross-chain communication.
Nowhere in the whitepaper do they address data storage in the network. IoT devices will not be transaction-only devices; they will need to maintain, transmit, and query data. Without that ability, the network will be crippled in its application.
IoTeX will use a variation of DPoS as its consensus mechanism. They are not specific about how this mechanism will work; there is no discussion of data flow or a node communication diagram. This will be their biggest hurdle, and I believe it is why it was left out of the whitepaper. Cryptography and theory are easy to elaborate on within each specific subject, but tying it all together (subchains with smart contracts, transacting with other side chains, with ring signatures, bulletproofs, and stealth addresses on the main chain) will be a challenge that I am not sure can be done efficiently.
They may be well positioned to make this work, but some of the core concepts of their network rest on unsolved problems and computationally heavy technologies, namely private transactions within smart contracts. So while all the theory and technical explanations make my pants tight, the realist in me will believe it when he sees it.
.
-Scalability – Neutral to medium, has the framework to address it with some issues that will hold it back.
-Applicability – Medium, has smart contract possibilities, privacy baked into network, no data framework
-Interoperability – Medium, inherent in the network design
-Consensus – Low, inherent private transactions may choke network. Consensus mechanism not at all laid out.
-Developability – To be seen, not mentioned.
.
.
CPChain
CPC puts a lot of focus on data storage. They recognize that one of the core needs of an IoT network is the ability to quickly store and reference large amounts of data, and that this has to be separate from the transactional basis of the network so as not to slow it down. They propose solving this with distributed hash tables (DHT), in the same fashion as INT, storing data in a decentralized way so that no single source owns the complete record. This system is much the same as the one used by BitTorrent, which keeps data available regardless of which nodes happen to be online at a given time. The data privacy issue is solved with client-side encryption using one-to-many public key cryptography, allowing many devices to decrypt a singly encrypted file while no two devices share the same key.
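As an illustration of how one-to-many encryption of this kind can work, here is a minimal envelope-encryption sketch in Python using the pyca/cryptography library. This is an assumption about the general technique, not CPChain's actual implementation; all names are hypothetical:

```python
# Envelope encryption sketch (assumed mechanism, not CPChain's code): the
# file is encrypted once with a symmetric key, and that key is wrapped
# separately for each device's public key, so no two devices share a key.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

data = b"sensor log 2018-03-01"
file_key = Fernet.generate_key()              # one symmetric key per file
ciphertext = Fernet(file_key).encrypt(data)   # data is encrypted only once

devices = {name: rsa.generate_private_key(public_exponent=65537, key_size=2048)
           for name in ("sensor_a", "gateway_b")}

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Wrap the file key once per recipient; each device gets its own wrapped copy.
wrapped = {name: key.public_key().encrypt(file_key, oaep)
           for name, key in devices.items()}

# Any authorized device unwraps its own copy and decrypts the one ciphertext.
recovered_key = devices["sensor_a"].decrypt(wrapped["sensor_a"], oaep)
assert Fernet(recovered_key).decrypt(ciphertext) == data
```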
This data layer will run on a separate, parallel chain so as not to clog the network, enabling scalability. They do not, however, discuss how they will scale the main chain. To partially address this, they will use a two-layer consensus structure centered on PoS to increase consensus efficiency. This two-layer system still requires the main layer to do all of the verification and block generation, which is a scaling issue: the network has no division of labor to keep congestion in one area from affecting the whole network.
They do recognize that the main chain would not be robust or reliable enough to handle high-frequency or real-time devices, and therefore propose side chains for those device types. Despite this, they are adding a significant amount of functionality (smart contracts, data interpretation) to the main chain instead of keeping it more general and lightweight, which constrains the possible applications for the network and also makes it more difficult to upgrade.
So while this project, on the surface level (not very technical whitepaper), seems to be a robust and well thought out framework, it doesn’t lend itself to an all-encompassing IoT network but more for a narrower, data centric, IoT application.
.
-Scalability – Neutral to medium, has the framework to address it somewhat, too much responsibility and functionality on the main chain may slow it down.
-Applicability – Medium, has smart contract possibilities, elaborate data storage solution with privacy in mind, as well as high-frequency applications thought out
-Interoperability – Low, not discussed
-Consensus – Low to medium, discussed solution has high reliance on single chain
-Developability – To be seen, not mentioned.
.
.
ITC
The whitepaper reads like someone grabbed some of the big hitters in crypto buzzword bingo, threw them in, and explained each one as if lifted from Wikipedia. It says nothing about how they will tie it all together, economically incentivize the security of the network, or maintain the data structures. I have a feeling none of them actually has any idea how to do any of this. For Christ's sake, they explain blockchain as the core of the "Solutions" portion of their whitepaper. This project is not worth any more analysis.
.
.
RuffChain
Centralization and trust. Not very well thought out at this stage. DPoS consensus on a single chain. Not much more than that.
.
.
WaltonChain
Waltonchain focuses on tracking and validating the manufacture and shipping of items using RFID technology. The structure will have a main chain/subchain framework, which will allow the network to segregate traffic and infinitely scale by the addition of subchains given available nodes and main chain bandwidth.
DPoST (Stake & Trust) will be the core of their consensus mechanism, which adds trust to the traditional staking structure. This trust is based on the age of the coins in the staker's node: the longer a node has held its coins, combined with the amount of coins held, the more likely that node will be elected to create the block. I am not sure how I feel about this, but I generally dislike added trust.
Waltonchain's framework will also allow smart contracts on the main chain. Again, this level of main-chain specificity worries me, both for scaling and for difficulty of upgrading. A smart contract core also does not lend itself to private transactions, though in this small subset of the IoT ecosystem that does not matter, as the whole basis of tracking is open, public records.
The whitepaper is not very technical, so I cannot comment on their technical completeness or exact implementation strategy.
This implementation of the relay/subchain framework is a very narrow and under-utilized application. As I said before, WTC may someday just be one part of a larger IoT ecosystem while interacting with another IoT network. This will not be an all-encompassing network.
.
-Scalability – High, main/subchain framework infinitely scales
-Applicability – Low to medium, their application is narrow
-Interoperability – Medium, the framework will allow it seamlessly
-Consensus – Neutral, should not choke the network but adds trust to the equation
-Developability – N/A, this is a more centralized project and development will likely be with the WTC
.
.
VeChain
*Let me preface this by saying I realize there is a place for centralized, corporatized, non-open-source projects in this space. Although I know this project is focused mainly on wider, more general business uses for blockchain, I was requested to include it in this analysis. I have edited my original comment, as it was more opinionated and therefore not productive to the conversation. If you would like to get a feel for my opinion, the original text is in the comments below.*
This project doesn't have much data to go off as the white paper does not contain much technical detail. It is focused on how they are positioning themselves to enable wider adoption of blockchain technology in the corporate ecosystem.
They also spend a fair amount of time covering their node structure and planned governance. What this reveals is a combined PoS and PoA system with levels of nodes and related rewards. Several of the node types require KYC (Know Your Customer) to establish trust in order to be part of the block-creating pool.
Again, there is not much technically that we can glean from this whitepaper. What is known is that this is not directed at the IoT market, and will be a PoS-and-PoA, Ethereum-like network with a trusted node setup.
I will leave out the grading points as there is not enough information to properly determine where they are at.
.
.
.
INT
So under this same lens, how does INT stack up? INT borrows its framework from Polkadot, a relay/subchain architecture. This framework allows for infinite scaling by the addition of subchains, given available nodes and relay chain bandwidth. Custom functionality in subchains lets whoever sets up a subchain define its requirements, be it private transactions, a state-transaction-free data chain, smart contracts, etc. This also lends to endless applicability. The main chain is deliberately simple in its functionality so as not to restrict any uses or future updates in technology or advances.
The consensus structure also takes a novel two-tiered approach, separating validation from block generation to further enable scaling: block generation moves from the side chains to the central relay chain, leaving the subchain nodes to validate transactions with a light DPoS, allowing a free-flowing transaction highway.
INT also recognizes that an IoT network needs robust and efficient data handling and storage. They are utilizing a decentralized storage system using DHT, much like the BitTorrent system. This, combined with the network implementation of all the major communication protocols (TCP/IP, UDP/IP, MANET), builds the framework of a network that can effortlessly integrate any device type for any application.
The multi-chain framework easily accommodates interoperability with established networks like the Internet, and enables pegging with other blockchains through a few simple transaction-type inclusions. With this cross-chain communication, manufacturers wouldn't have to negotiate their needs to fit an established blockchain; they could create their own subchain to fit their needs and interact with the greater network through the relay.
The team also understands the development hurdles facing the environment. They plan to solve this by standardizing requirements for communication and data exchange. They have heavy ties with several manufacturers and are currently developing an IoT router to be the gateway to the network.
.
-Scalability – High, relay/subchain framework enables infinite scalability
-Applicability – High, highest I could find for IoT. Subchains can be created for every possible application.
-Interoperability – High, able to add established networks for data support and cross chain transactions
-Consensus – High, the only structure that separates the two responsibilities of verifying and block generation to further enable scaling and not choke applicability.
-Developability – Medium, network is set up for ease of development with well-known language and subchain capability. Already working with device manufacturers. To be seen.
.
.
So with all that said, INT may be in the best place to tackle this space with their chosen framework and philosophy. They set out to accomplish more than WTC or VEN, in a network that is better equipped than IOTA or Ethereum. If they can execute on what they have laid out, there is no reason they won't become the market leader, easily overtaking the market cap of VeChain ($2.5Bn, $10 INT) in the short term and IOTA ($7Bn, $28 INT) in the medium term.
submitted by Graytrain to INT_Chain

DAG Technology Analysis and Measurement

This report was produced by the Huobi Blockchain Research Institute. Authors: Yuan Yuming, Hu Zhiwei. For the PDF version, please see the original text download.
Summary
The Huobi Blockchain Application Research Institute conducted research on distributed ledger technology based on the directed acyclic graph (DAG) data structure from a technical perspective, and obtained its main research results through specific technical tests of IOTA, a typical representative project.
Report body
1 Introduction
Blockchain is a distributed ledger technology, but distributed ledger technology is not limited to the blockchain. In the wave of digital economic development, more distributed ledger technologies are being explored and applied in order to improve on the original technology and to meet more practical business application scenarios. The Directed Acyclic Graph (hereinafter "DAG") is one representative.
What is DAG technology, what design thinking lies behind it, and how does it perform in practice? We attempted to reach analytical conclusions through a deep analysis of DAG technology and actual test runs of IOTA, a representative project.
It should be noted that the indicator data obtained from the tests are not, and should not be considered, proof or confirmation of the final effectiveness of the IOTA platform or project.
2. Main conclusions
After research and test analysis, we have the following main conclusions and technical recommendations:
3.DAG Introduction
3.1. Introduction to DAG Principle
A DAG (Directed Acyclic Graph) is a data structure representing a directed graph in which no path leads from any vertex back to itself (no loops), as shown in the figure below:
[Figure: example of a directed acyclic graph]
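To make the "no loops" property concrete, here is a minimal Python sketch that tests whether a graph is a DAG using Kahn's topological-sort algorithm (all names are illustrative):

```python
# A graph is a DAG exactly when a topological sort can drain every vertex.
from collections import deque

def is_dag(edges, vertices):
    indegree = {v: 0 for v in vertices}
    for src, dst in edges:
        indegree[dst] += 1
    queue = deque(v for v, d in indegree.items() if d == 0)
    visited = 0
    while queue:
        v = queue.popleft()
        visited += 1
        for src, dst in edges:          # remove v's outgoing edges
            if src == v:
                indegree[dst] -= 1
                if indegree[dst] == 0:
                    queue.append(dst)
    return visited == len(vertices)     # all vertices drained => no cycle

print(is_dag([("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")], "abcd"))  # True
print(is_dag([("a", "b"), ("b", "c"), ("c", "a")], "abc"))               # False: cycle
```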
Since distributed ledgers based on DAG technology (hereinafter simply "DAG") were proposed in recent years, many have seen them as a hopeful replacement for blockchain technology in the narrow sense, because DAG was designed to preserve the advantages of the blockchain while improving on its shortcomings.
Unlike the traditional linear blockchain structure, the transaction records of a DAG-based distributed ledger platform such as IOTA form a directed acyclic graph, as shown in the following figure.
[Figure: IOTA Tangle transaction graph]
3.2. DAG characteristics
Because its data structure differs from that of earlier blockchains, DAG-based distributed ledger technology offers high scalability and high concurrency, and is well suited to IoT scenarios.
3.2.1. High scalability, high concurrency
The data synchronization mechanism of traditional linear blockchains (such as Ethereum) is synchronous, which can cause network congestion. A DAG network adopts an asynchronous communication mechanism that allows concurrent writing: multiple nodes can post transactions simultaneously, at different paces, without a strict global ordering. The network's data may therefore be momentarily inconsistent, but it eventually synchronizes.

3.2.2. Applicable to IoT scenarios

In a traditional blockchain network, each block contains many transactions, which miners package and broadcast together, involving multiple users. In a DAG network there is no concept of a "block"; the smallest unit of the network is the "transaction", and each new transaction must verify two earlier transactions. A DAG network therefore needs no miners to relay trust, and transfers require no fee, which makes DAG technology suitable for small payments.
4. Analysis of technical ideas
A trilemma means that, in a particular situation, only two of three advantageous options can be selected, or one of three adverse choices must be taken. Selection dilemmas of this type appear in fields as varied as religion, law, philosophy, economics, and business management. Blockchain is no exception. The impossible triangle in blockchain is: of Scalability, Decentralization, and Security, only two can be chosen.
Analyzing DAG technology along these lines, and given the introduction above, DAG has undoubtedly claimed the two corners of decentralization and scalability. The two can be considered two sides of the same coin: the asynchronous bookkeeping brought about by the DAG data structure simultaneously achieves a high degree of decentralization among the participating network nodes and scalability of transactions.
5. There is a problem
Since the characteristics of the data structure deliver decentralization and scalability at the same time, the impossible-triangle theory suggests that security is the hidden weakness. But DAG is a relatively innovative and special structure, so might it achieve security as well? Actual results suggest it does not.
5.1. Double-spend problem
The asynchronous communication characteristic of DAG makes a double-spend attack possible. For example, an attacker adds two conflicting transactions (a double spend) at two different locations in the network. Both transactions are approved forward through the network until they appear on the verification path of the same transaction, at which point the network discovers the conflict. The common ancestor node at which the two transactions converge can then determine which transaction is the double-spend attack.
If verification paths are too short, a problem like "blowball" can arise: in the extreme case where most transactions are "lazy" and verify only early transactions, the transaction network forms a topology centered on a minority of early core transactions. That is not a good thing for a DAG, which relies on ever-increasing transactions to improve network reliability.
At present, the double-spend problem therefore has to be addressed in each design by weighing the network's actual circumstances, and different DAG networks have their own solutions.
5.2. Shadow chain problem
Because of the potential for double spends, an attacker who can build a sufficient number of transactions may fork a fraudulent branch (a shadow chain) from the real network data, include a double-spend transaction in it, and then merge the branch back into the DAG network. In this case, it is possible for the branch to replace the original transaction data.
6. Introduction to the current improvement plan
At present, projects mainly guarantee safety by sacrificing some of the DAG's native characteristics.
The IOTA project uses a Markov chain Monte Carlo (MCMC) approach to solve this problem. IOTA introduces the concept of Cumulative Weight for transactions, recording how many times a transaction has been referenced, to indicate its importance. The MCMC algorithm selects existing transactions in the current network as references for newly added transactions via a random walk weighted by cumulative weight; that is, the more referenced a transaction path is, the more likely the algorithm is to select it. The walk strategy was further optimized in version 1.5.0 to keep the "width" of the transaction topology within a reasonable range, making the network more secure.
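The following Python sketch illustrates the idea of a cumulative-weight-biased random walk for tip selection. It is a simplified illustration of the MCMC concept described above, not the actual IRI implementation; the graph, weights, and ALPHA parameter are assumed for the example:

```python
# Simplified cumulative-weight-biased random walk toward a tip.
import math
import random

# approvers[tx] lists transactions that directly approve tx (toy data).
approvers = {"genesis": ["t1", "t2"], "t1": ["t3"], "t2": ["t3", "t4"],
             "t3": [], "t4": []}
weight = {"genesis": 5, "t1": 2, "t2": 3, "t3": 1, "t4": 1}  # cumulative weights
ALPHA = 0.5  # bias strength: higher alpha favors heavier (more-referenced) paths

def walk(start="genesis"):
    tx = start
    while approvers[tx]:                       # stop when we reach a tip
        nexts = approvers[tx]
        probs = [math.exp(ALPHA * weight[n]) for n in nexts]
        tx = random.choices(nexts, weights=probs)[0]
    return tx

# Heavier branches are selected more often, so honest, well-referenced
# transactions keep attracting new approvals.
print(walk())
```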
However, at platform startup, when the number of participating nodes and transactions is limited, it is hard to prevent a malicious organization with many nodes from sending a large number of malicious transactions and mounting a shadow-chain attack on the whole network. An authoritative arbitration institution is therefore needed to determine the validity of transactions. In IOTA, this node is the Coordinator, which periodically snapshots the current transaction network (the Tangle); the transactions contained in the snapshot are confirmed as valid. But the Coordinator won't exist forever: as the network runs and grows, IOTA intends to remove the Coordinator at some point in the future.
Byteball's improvement scheme is characterized by its design of witnesses and the main chain. Because the DAG structure yields many transactions with only a partial order, avoiding double spends requires establishing a total order over these transactions to form a transaction backbone: the earlier of two conflicting transactions on the main chain is considered the valid one. Witnesses, roles held by well-known users or institutions, form the main chain by constantly sending transactions that confirm other users' transactions.
Such schemes change the character of DAG-based platforms. Taking IOTA as an example, the introduction of the Coordinator reduces its decentralization to some extent.
7. Actual operation
7.1. Positive effects
In addition to solving security problems, the above solutions can also solve the smart contract problem to some extent.
The native features of DAG cause two potential problems: (1) transaction confirmation time is uncontrollable; the current retransmission-request mechanism requires complicated timeout design on the client side, where a simple one-shot confirmation mechanism would be preferable; and (2) there is no global ordering mechanism, which limits the types of operations the system can support. For these reasons, it is difficult to implement a Turing-complete smart contract system on a DAG-based distributed ledger platform.
To ensure that smart contracts can run, some body is needed to do the ordering work described above. The current Coordinator or main-chain designs can achieve similar results.
7.2. Negative effects
TPS is one of the most intuitive indicators, and a DAG's TPS should theoretically be unlimited. If the maximum TPS of the IOTA platform is compared to the capacity of a factory, then day-to-day TPS is the factory's daily output.
For maximum TPS, the April 2017 IOTA stress test showed that the network had transaction processing capability of 112 CTPS (confirmed transactions per second) and 895 TPS. This was the result on a small test network of 250 nodes.
For day-to-day TPS, the currently public data show the recent average TPS of the main network at about 8.2, with CTPS at about 2.7.
[Figure: IOTA mainnet TPS/CTPS statistics]
The average TPS of the test network is about 4, with CTPS about 3.
[Figure: IOTA testnet TPS/CTPS statistics]
Data source: Discord bot generic-iota-bot#5760
Is this related to the existence of the Coordinator? Actual testing is needed to find out.
8. Measured analysis
The operational statistics of the open test network depend on many factors. For further analysis, we continued to use the IOTA platform as an example and built a private test environment for technical measurement.
8.1. Test Architecture
The relationship between the components we built for this test is shown below.
[Figure: test architecture diagram]
Among them:
8.2. Testing the hardware environment
The servers are Amazon AWS EC2 c5.4xlarge instances: 16-core 3 GHz Intel Xeon Platinum 8124M CPU, 32 GB memory, 10 Gbps LAN between servers with communication delay (ping) under 1 ms, running Ubuntu 16.04.
8.3. Test scenarios and results analysis

8.3.1. Default PoW Difficulty Value

Although there is no concept of "miners", an IOTA node must still perform a proof of work before sending a transaction, to prevent the network from being flooded with transactions. The difficulty, called the Minimum Weight Magnitude (MWM), is similar to Bitcoin's: the PoW result must end in a given number of "9" trytes, where each "9" is "000" in the ternary encoding IOTA uses. The IOTA difficulty value can be set before the node is started.
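Here is a simplified Python sketch of the MWM idea: grind a nonce until the transaction hash ends in the required number of "9" trytes. A SHA-256 stand-in replaces IOTA's actual Curl hash, and the real MWM counts trailing zero trits rather than trytes, so treat this purely as an illustration:

```python
# Toy Minimum Weight Magnitude: search for a nonce whose hash ends in
# enough "9" trytes ('9' is the zero tryte, i.e. "000" in trits).
import hashlib

TRYTES = "9ABCDEFGHIJKLMNOPQRSTUVWXYZ"   # the 27 tryte characters

def tryte_hash(data: bytes) -> str:
    digest = hashlib.sha256(data).digest()  # stand-in for Curl, for illustration
    return "".join(TRYTES[b % 27] for b in digest)

def do_pow(tx: bytes, mwm_trytes: int) -> int:
    nonce = 0
    while not tryte_hash(tx + str(nonce).encode()).endswith("9" * mwm_trytes):
        nonce += 1
    return nonce

# Expected work grows roughly 27x per extra required tryte.
print(do_pow(b"demo-transaction", 2))
```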
Currently the difficulty value is set to 14 for the IOTA production network and 9 for the test network. We therefore first test with the test network's default difficulty of 9, obtaining the following results.
[Figure: test results at difficulty 9]
Since each IOTA bundle contains multiple transfers, the actual processed TPS will be higher than the send rate. But a script parsing the zmq feed shows that the observed TPS is very low; the number of requests that can be successfully sent per second is also low.
After analysis, the reason is that the test used VPS instances: the PoW calculation occupies the CPU, so the transaction rate is limited mainly by how fast each client can compute the PoW and send transactions.

8.3.2. Decrease the PoW difficulty value

Re-testing with the difficulty value set to 1 gives the following results.
[Figure: test results at difficulty 1]
As the results show, TPS increases after the difficulty is reduced. The current TPS of the IOTA network is therefore not bottlenecked at the Coordinator, but mainly by the hardware and network of the client sending the transactions. The IOTA community is currently working on an FPGA-based implementation of the Curl algorithm and on CPU instruction set optimization. Our test results also confirm that the performance potential of the DAG platform can be further explored in this way.

8.3.3. Reduce the number of test network nodes

Given the characteristics of DAG, the platform's actual TPS may also be related to the number of network nodes. Keeping the difficulty value at 1, we reduced the number of network nodes to 10 and repeated the test, obtaining the following results.
[Figure: test results with 10 nodes]
As the results show, when the number of nodes decreases, the actual processed TPS also decreases, falling below the send rate. This indicates that in a DAG environment, maintaining a sufficiently large number of nodes facilitates transaction processing.
9. Reference materials
https://www.iota.org/
https://en.wikipedia.org/wiki/Trilemma
https://blog.iota.org/new-tip-selection-algorithm-in-iri-1-5-0-61294c1df6f1
https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo
https://byteball.org/
https://www.iotachina.com/iota.html
https://www.iotachina.com/iota_tutorial_1.html
submitted by i0tal0ver to Iota

AMD's Growing CPU Advantage Over Intel

https://seekingalpha.com/article/4152240-amds-growing-cpu-advantage-intel?page=1
AMD's Growing CPU Advantage Over Intel Mar. 1.18 | About: Advanced Micro (AMD)
Raymond Caron, Ph.D. Tech, solar, natural resources, energy (315 followers)

Summary
• AMD's past and economic hazards.
• AMD's current market conditions.
• AMD's Zen CPU advantage over Intel.

AMD is primarily a CPU fabrication company with much experience and a great history in that respect. They hold patents for 64-bit processing, as well as ARM-based processing patents and GPU architecture patents. AMD built a name for itself in the mid-to-late 90's when they introduced the K-series CPU's to good reviews, followed by the Athlon series in '99. AMD was profitable, and bought the companies NexGen, Alchemy Semiconductor, and ATI.

Past Economic Hazards
If AMD has such a great history, then what happened? Before I go over the technical advantage that AMD has over Intel, it's worth looking at how AMD failed in the past, and whether those hazards still present a risk to AMD, since for investment purposes we're more interested in AMD turning a profit. AMD suffered from intermittent CPU fabrication problems, and was also the victim of sustained anti-competitive behaviour from Intel, who interfered with AMD's attempts to sell its CPU's to the market through Sony, Hitachi, Toshiba, Fujitsu, NEC, Dell, Gateway, HP, Acer, and Lenovo. Intel was investigated and/or fined by multiple countries including Japan, Korea, the USA, and the EU. These hazards need to be examined to see if history will repeat itself. There have been some rather large changes in the market since then.
1) The EU has shown they are not averse to levying large fines, and Intel is still fighting the guilty verdict from the last EU fine against them; they've already lost one appeal. It's conceivable that the EU, and other countries, would prosecute Intel again. This is compounded by the recent security problems with Intel CPU's and the fact that Intel sold these CPU's under false advertising as secure when Intel knew they were not. Here are some of the largest fines dished out by the EU.
2) The Internet has evolved from Web 1.0 to 2.0. Consumers are increasing their online presence each year. This reduces the clout that Intel can wield over the market as AMD can more easily sell to consumers through smaller Internet based companies.
3) Traditional distributors (HP, Dell, Lenovo, etc.) are struggling. All of these companies have had recent issues with declining revenue due to Internet competition and ARM competition. These companies are struggling for sales, and this reduces the clout that Intel has over them, as Intel is no longer able to ensure their future. It no longer pays to be in the club. These points are summarized in the graph below, from Statista, which shows "ODM Direct" sales and "other sales" increasing their market share from 2009 to Q3 2017.
4) AMD spun off Global Foundries as a separate company. AMD has a fabrication agreement with Global Foundries, but is also free to fabricate at another foundry such as TSMC, where AMD has recently announced they will be printing Vega at 7nm.
5) Global Foundries developed the capability to fabricate at 16nm, 14nm, and 12nm alongside Samsung, and IBM, and bought the process from IBM to fabricate at 7nm. These three companies have been cooperating to develop new fabrication nodes.
6) The computer market has grown much larger since the mid-90’s – 2006 when AMD last had a significant tangible advantage over Intel, as computer sales rose steadily until 2011 before starting a slow decline, see Statista graph below. The decline corresponds directly to the loss of competition in the marketplace between AMD and Intel, when AMD released the Bulldozer CPU in 2011. Tablets also became available starting in 2010 and contributed to the fall in computer sales which started falling in 2012. It’s important to note that computer shipments did not fall in 2017, they remained static, and AMD’s GPU market share rose in Q4 2017 at the expense of Nvidia and Intel.
7) In terms of fabrication, AMD has access to 7nm through Global Foundries as well as TSMC. It's unlikely that AMD will experience CPU fabrication problems in the future. This is something of a reversal of fortunes, as Intel is now experiencing issues with its 10nm fabrication facilities, which are behind schedule by more than 2 years, and maybe longer. It would be costly for Intel to use another foundry to print their CPU's due to the overhead that their current foundries have on their bottom line. If Intel is unable to get the 10nm process working, they're going to have difficulty competing with AMD.

AMD: Current market conditions
In 2011 AMD released its Bulldozer line of CPU's to poor reviews and was relegated to selling on the discount market where sales margins are low. Since that time AMD's profits have been largely determined by the performance of its GPU and Semi-Custom business. Analysts have become accustomed to looking at AMD's revenue from a GPU perspective, which isn't currently being seen in a positive light due to the relation between AMD GPU's and cryptocurrency mining.
The market views cryptocurrency as further risk to AMD. When Bitcoin was introduced it was also mined with GPU’s. When the currency switched to ASIC circuits (a basic inexpensive and simple circuit) for increased profitability (ASIC’s are cheaper because they’re simple), the GPU’s purchased for mining were resold on the market and ended up competing with and hurting new AMD GPU sales. There is also perceived risk to AMD from Nvidia which has favorable reviews for its Pascal GPU offerings. While AMD has been selling GPU’s they haven’t increased GPU supply due to cryptocurrency demand, while Nvidia has. This resulted in a very high cost for AMD GPU’s relative to Nvidia’s. There are strategic reasons for AMD’s current position:
1) While AMD GPU's are profitable and greatly desired for cryptocurrency mining, AMD's market access is through third-party resellers, who enjoy the revenue from marked-up GPU sales. AMD most likely makes lower margins on GPU sales relative to Zen CPU sales due to the higher fabrication costs of larger dies and the corresponding lower yield. For reference, I've included the sizes of AMD's and Nvidia's GPU's as well as AMD's Ryzen CPU and Intel's Coffee Lake 8th-generation CPU. This suggests that if AMD had to pick and choose between products, they'd focus on Zen due to the higher yield, higher revenue from sales, and increased margin.
2) If AMD maintained historical levels of GPU production in the face of cryptocurrency demand, while increasing production for Zen products, they would maximize potential income from the highest-margin products (EPYC), while reducing future vulnerability to second-hand GPU's being resold on the market.
3) AMD was burned in the past by second-hand GPU's and wants to avoid repeating that experience. AMD stated several times that the cryptocurrency boom was not factored into forward-looking statements, meaning they haven't produced more GPU's in expectation of more GPU sales.
In contrast, Nvidia increased its production of GPU’s due to cryptocurrency demand, as AMD did in the past. Since their Pascal GPU has entered its 2nd year on the market and is capable of running video games for years to come (1080p and 4k gaming), Nvidia will be entering a position where they will be competing directly with older GPU’s used for mining, that are as capable as the cards Nvidia is currently selling. Second-hand GPU’s from mining are known to function very well, with only a need to replace the fan. This is because semiconductors work best in a steady state, as opposed to being turned on and off, so it will endure less wear when used 24/7.
The market is also pessimistic regarding AMD’s P/E ratio. The market is accustomed to evaluating stocks using the P/E ratio. This statistical test is not actually accurate in evaluating new companies, or companies going into or coming out of bankruptcy. It is more accurate in evaluating companies that have a consistent business operating trend over time.
"Similarly, a company with very low earnings now may command a very high P/E ratio even though it isn't necessarily overvalued. The company may have just IPO'd and growth expectations are very high, or expectations remain high since the company dominates the technology in its space." (P/E Ratio: Problems With The P/E)

I regard the pessimism surrounding AMD stock due to GPU's and past history as a positive trait, because the threat is minor. While AMD is experiencing competitive problems with its GPU's in gaming, AMD holds an advantage in Blockchain processing, which stands to be a larger and more lucrative market. I also believe that AMD's progress with Zen, particularly with EPYC, and the recent Meltdown-related security and performance issues with all Intel CPU offerings, far outweigh any GPU turbulence. This turns the pessimism surrounding AMD regarding its GPU's into a stock benefit.
1) A pessimistic group prevents the stock from becoming a bubble. It provides a counter-argument against hype relating to product launches that are not proven by earnings, which is unfortunately a historical trend for AMD, as they have had difficulty selling server CPU's and consumer CPU's in the past due to market interference by Intel.
2) It creates predictable daily, weekly, monthly, and quarterly fluctuations in the stock price that can be used to generate income.
3) Due to recent product launches and market conditions (Zen architecture advantage, 12nm node launching, Meltdown performance flaw affecting all Intel CPU's, Intel's problems with 10nm) and the fact that AMD is once again selling a competitive product, AMD is making more money each quarter. Therefore the base price of AMD's stock will rise with earnings, as we're seeing. This is also a form of investment security, where perceived losses are returned over time, due to a stock that is in a long-term upward trajectory driven by new products reaching a responsive market.
4) AMD remains a cheap stock. While it's volatile, it's in a long-term upward trend due to market conditions and new product launches. An investor can buy more stock (with a limited budget) to maximize earnings. This also means that the stock is more easily manipulated, as seen during the Q3 2017 ER.
5) The pessimism is unfounded. The cryptocurrency craze hasn’t died, it increased – fell – and recovered. The second hand market did not see an influx of mining GPU’s as mining remains profitable.
6) Blockchain is an emerging market that will eclipse the gaming market in size due to the wide breadth of applications across various industries. Vega is a highly desired product for Blockchain applications, as AMD has retained a processing and performance advantage over Nvidia. There are more and rapidly growing applications for Blockchain every day, all (or most) of which will require GPU's: for instance Microsoft, the Golem supercomputer, IBM, HP, Oracle, Red Hat, and others.

Long-term upwards trend
AMD is at the beginning of a long-term upward trend supported by a comprehensive and competitive product portfolio that is still being delivered to the market; AMD referred to this as product ramping. AMD's most effective products with Zen are EPYC and the Raven Ridge APU. EPYC entered the market in mid-December and was completely sold out by mid-January, but has since been restocked. Intel remains uncompetitive in that industry as their CPU offerings are hampered by a 40% performance flaw due to Meltdown patches. Server CPU sales command the highest margins for both Intel and AMD.
The AMD Raven Ridge APU was recently released to excellent reviews. The APU is significant due to high GPU prices driven by cryptocurrency, and the fact that the APU is a CPU/GPU hybrid with the performance to play today's games at 1080p. The APU also supports the Vulkan API, which can call upon multiple GPU's to increase performance, so a system can be upgraded at a later date with an AMD or Nvidia GPU that supports Vulkan, for increased performance in games or workloads programmed to support it. Or the APU can be replaced when GPU prices fall.
AMD also stands to benefit as Intel confirmed that their new 10nm fabrication node is behind in technical capability relative to the Samsung, TSMC, and Global Foundries 7nm fabrication processes. This brings into question Intel's competitiveness in 2019 and beyond.

Take-Away
• AMD was uncompetitive with respect to CPU's from 2011 to 2017.
• When AMD was competitive, from 1996 to 2011, they recorded profits and bought 3 companies, including ATI.
• AMD's CPU business suffered from:
• Market manipulation by Intel; Intel was fined by the EU, Japan, and Korea, and settled with the USA.
• Foundry productivity and upgrade complications.
• AMD has changed:
• Global Foundries was spun off as an independent business.
• Global Foundries has developed 14nm and 12nm, and is implementing 7nm fabrication.
• Intel is late on 10nm, which is less competitive than the 7nm node.
• AMD can fabricate products at multiple foundries (TSMC, Global Foundries).
• The market has changed:
• More AMD products are available on the Internet, and both the adoption of the Internet and the size of the Internet retail market have exploded, thanks to the success of smartphones and tablets.
• Consumer habits have changed; more people shop online each year. Traditional retailers have lost market share.
• The computer market is larger (on average) but has been declining. While computer shipments declined in Q2 and Q3 2017, AMD sold more CPU's.
• Analysts look to GPU and Semi-Custom sales for revenue.
• The cryptocurrency boom intensified; no crash occurred.
• AMD did not increase GPU production to meet cryptocurrency demand.
• Blockchain represents new growth potential for AMD GPU's.
• Pessimism acts as security against a stock bubble and corresponding bust.
• It creates cyclical volatility in the stock that can be used to generate profit.
• The P/E ratio is misleading when used to evaluate AMD.
• AMD has long-term growth potential.
• In 2017 AMD released a competitive product portfolio.
• Since Zen was released in March 2017, AMD has beat ER expectations.
• AMD returned to profitability in 2017.
• AMD is taking measurable market share from Intel in OEM desktop CPU's and in the CPU market overall.
• The high-margin server product EPYC was released in December 2017, before the worst-ever CPU security bug was found in Intel CPU's, which are hit with a detrimental 40% performance patch.
• The Ryzen APU (Raven Ridge) was announced in February 2018 to meet the gaming GPU shortage created by high GPU demand for cryptocurrency mining.
• Blockchain is a long-term growth opportunity for AMD.
• Intel is behind the competition for the next CPU fabrication node.

AMD's growing CPU advantage over Intel

About AMD's Zen
Zen is a technical breakthrough in CPU architecture because it's a modular design and because it is a small CPU while providing similar or better performance than the Intel competition.
Since Zen was released in March 2017, we've seen AMD go from 18% CPU market share in OEM consumer desktops to essentially 50% market share. This is supported by comments from Lisa Su during the Q3 2017 ER call, by MindFactory.de, and by Amazon sales of CPU's. We also saw AMD increase its share of the total desktop CPU market, and market share now fluctuates between AMD and Intel as new CPU's are released. Zen is a technical breakthrough supported by a few general guidelines relating to electronics, which provide AMD with an across-the-board advantage over Intel in every CPU market addressed.
1) The larger the CPU the lower the yield. - Zen architecture that makes up Ryzen, Threadripper, and EPYC is smaller (44 mm2 compared to 151 mm2 for Coffee Lake). A larger CPU means fewer CPU’s made during fabrication per wafer. AMD will have roughly 3x the fabrication yield for each Zen printed compared to each Coffee Lake printed, therefore each CPU has a much lower cost of manufacturing.
2) The larger the CPU, the harder it is to fabricate without errors. - The chance that a CPU will be perfectly fabricated falls exponentially with increasing surface area. Intel will have fewer high-quality CPU's printed compared to AMD. This means that AMD will make a higher margin on each CPU sold. AMD's supply of perfectly printed Ryzens (1800X) was so high that the company had to sell them at a reduced cost in order to meet supply demands for the cheaper Ryzen 5 1600X. If you bought a 1600X in August/September, you probably ended up with an 1800X.
3) Larger CPU's are harder to fabricate without errors on smaller nodes. - The technical capability to fabricate CPU's at smaller nodes becomes more difficult due to the higher precision that is required to fabricate at a smaller node, and due to the corresponding increase in errors. "A second reason for the slowdown is that it's simply getting harder to design, inspect and test chips at advanced nodes. Physical effects such as heat, electrostatic discharge and electromagnetic interference are more pronounced at 7nm than at 28nm. It also takes more power to drive signals through skinny wires, and circuits are more sensitive to test and inspection, as well as to thermal migration across a chip. All of that needs to be accounted for and simulated using multi-physics simulation, emulation and prototyping." (Is 7nm The Last Major Node?) "Simply put, the first generation of 10nm requires small processors to ensure high yields. Intel seems to be putting the smaller die sizes (i.e. anything under 15W for a laptop) into the 10nm Cannon Lake bucket, while the larger 35W+ chips will be on 14++ Coffee Lake, a tried and tested sub-node for larger CPUs. While the desktop sits on 14++ for a bit longer, it gives time for Intel to further develop their 10nm fabrication abilities, leading to their 10+ process for larger chips by working their other large chip segments (FPGA, MIC) first." There are plenty of steps where errors can be created within a fabricated CPU. This is most likely the culprit behind Intel's inability to launch its 10nm fabrication process: they're simply unable to print such a large CPU on such a small node with high enough yields to make the process competitive. Intel thought they were ahead of the competition with respect to printing large CPU's on a small node, until AMD avoided the issue completely by designing a smaller modular CPU. Intel avoided any mention of its 10nm node during its Q4 2017 ER, which I interpret as bad news for Intel shareholders. If you have nothing good to say, then you don't say anything; Intel having nothing to say about something that is fundamentally critical to its success as a company can't be good. Intel is on track, however, to deliver hybrid CPU's where some small components are printed on 10nm.
4) AMD Zen is a new architecture built from the ground up. Intel's CPU's are built on top of older architecture developed with 30-year-old strategies, some of which we've recently discovered are flawed. This resulted in the Meltdown flaw, the Spectre flaws, and also the ME and AMT bugs in Intel CPU's. While AMD is still affected by Spectre, AMD has only ever acknowledged being fully susceptible to Spectre 1, as AMD considers Spectre 2 difficult to exploit on an AMD Zen CPU.
"It is much more difficult on all AMD CPUs, because BTB entries are not aliased - the attacker must know (and be able to execute arbitrary code at) the exact address of the targeted branch instruction." (Technical Analysis of Spectre & Meltdown, AMD)

Further reading:
- Spectre and Meltdown: Linux creator Linus Torvalds criticises Intel's 'garbage' patches | ZDNet
- FYI: Processor bugs are everywhere - just ask Intel and AMD
- Meltdown and Spectre: Good news for AMD users, (more) bad news for Intel
- Cybersecurity agency: The only sure defense against huge chip flaw is a new chip
- Kernel-memory-leaking Intel processor design flaw forces Linux, Windows redesign

Take-Away
• AMD Zen enjoys a CPU fabrication yield advantage over Intel.
• AMD Zen enjoys a higher yield of high-quality CPU's.
• Intel's CPU's suffer a 40% performance drop due to the Meltdown flaw, which affects server CPU sales.
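As a back-of-envelope check on points 1) through 3) above, the sketch below applies a standard dies-per-wafer estimate and a Poisson yield model to the die sizes quoted in the text (44 mm² for Zen, 151 mm² for Coffee Lake). The wafer size and defect density are assumed values for illustration, not AMD or Intel data:

```python
# Smaller dies give more dies per wafer and exponentially better yield.
import math

WAFER_DIAMETER_MM = 300
DEFECTS_PER_CM2 = 0.2  # assumed defect density, for illustration only

def dies_per_wafer(die_area_mm2: float) -> int:
    r = WAFER_DIAMETER_MM / 2
    # Wafer area / die area, minus an edge-loss correction along the rim.
    return int(math.pi * r * r / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float) -> float:
    # Fraction of dies with zero defects, given the assumed defect density.
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)  # area in cm^2

for name, area in (("Zen die (per text)", 44), ("Coffee Lake die (per text)", 151)):
    print(f"{name}: ~{dies_per_wafer(area)} dies/wafer, "
          f"yield ~{poisson_yield(area):.0%}")
```

Under these assumptions the smaller die yields roughly 3x as many good dies per wafer, consistent with the rough factor quoted in the article.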
AMD stock drivers
1) EPYC
- A critically acclaimed CPU that is sold at a discount compared to Intel.
- Is not affected by 40% software slow-downs due to Meltdown.
2) Raven Ridge desktop APU
- Targets the unfed GPU market, which has been stifled by cryptocurrency demand.
- Customers can upgrade to a new CPU or add a GPU at a later date without changing the motherboard.
- AM4 motherboard supported until 2020.
3) Vega GPU sales to Intel for 8th-generation CPU's with integrated graphics.
- AMD gains access to the complete desktop and mobile market through Intel.
4) Mobile Ryzen APU sales
- Providing gaming capability in a compact power envelope.
5) Ryzen and Threadripper sales
- Fabricated on 12nm in April.
- May eliminate Intel's last remaining CPU advantage in IPC single-core processing.
- AM4 motherboard supported until 2020.
- 7nm Ryzen on track for early 2019.
6) Others: Vega, Polaris, Semi-custom, etc.
- I consider any positive developments here to be gravy.

Conclusion
While in the past Intel interfered with AMD's ability to bring its products to market, the market has changed. The Internet has grown significantly and is now a large market that dominates computer sales. It's questionable whether Intel still has the influence to affect this new market, and doing so would most certainly result in fines and further bad press.
AMD's foundry problems were turned into an advantage over Intel.
AMD's more recent past was heavily influenced by the failure of the Bulldozer line of CPU's, which dragged on AMD's bottom line from 2011 to 2017.
AMD's Zen line of CPU's is a breakthrough that exploits an alternative, superior strategy in chip design, one which results in a smaller CPU. A smaller CPU enjoys compounded yield and quality advantages over Intel's CPU architecture. Intel's lead in CPU performance will at the very least be challenged, and will more likely come to an end in 2018, until they release a redesigned CPU.
I previously targeted AMD to be worth $20 by the end of the Q4 2017 ER. This was based on the speed at which Intel is able to get products to market; AMD, in comparison, is much slower. I believe the stock should be there already, but the GPU-related story was prominent due to the cryptocurrency craze. Financial analysts need more time to catch on to what's happening with AMD; they need an ER that is driven by CPU sales, and I believe Q1 2018 is that ER. AMD had EPYC stock in stores when the Meltdown and Spectre flaws hit the news. These CPU's were sold out by mid-January, and they are large-margin sales.
There are many variables at play within the market; however, barring any disruptions, I'd expect AMD to be worth $20 at some point in 2018 due to these market drivers. If AMD sells enough EPYC CPU's due to Intel's ongoing CPU security problems, it may occur following the Q1 2018 ER. However, if anything is customary with AMD, it's that these things always take longer than expected.
submitted by kchia124 to AMD_Stock

Will Vite be strong with Smart Contracts in DAG?

What is “DAG + Smart Contract”?

Introduction
Compared to traditional Blockchain, implementing smart contracts based on a DAG (Directed Acyclic Graph) ledger is undoubtedly more challenging. So what exactly are these difficulties and where do they come from?

Cryptocurrency
First, let’s start with the most basic scenario — cryptocurrency. The concept of a cryptocurrency is a distributed database that stores the balance information for each account. The computers running in a cryptocurrency network are referred to as Nodes. Each node stores a piece of data about the account balance, which is often referred to as State.

Unlike traditional centralized banking systems, a distributed ledger requires all nodes to reach consensus on the state; in other words, the balance of each account stored on every computer in the network is the same. Since state data may be large in certain systems, transmitting full states would cause huge network overhead, so such systems transmit only the events that change state, called Transactions. Put simply, your money will not increase or decrease on its own; an account balance only changes when you pay someone or are paid. Therefore, as long as you have the full historical transaction record of an account, you can easily calculate its balance.

In a cryptocurrency system, all transactions are recorded in a data structure called the ledger. The ledger is protected by cryptography so that each node can verify whether the data transmitted from other nodes has been tampered with. Blockchain is one classic ledger structure, and DAG is another.

Blockchain-Based Cryptocurrency
Here the question comes: if the ledger obtained by each node is exactly the same, can they all calculate the same state? First, let's look at the ledger structure of the blockchain. In this structure, all transactions are ordered; changing the order of any two transactions would break the hash references of the blockchain, creating an invalid ledger. Therefore, starting from the same initial state, no matter which node performs the calculation, the same transaction sequence always produces the same result. Perfect, isn't it? Whether it is Bitcoin or Ethereum, there is no need to transfer and compare huge state data between nodes; all that is necessary is to reach consensus on the ledger data. The information contained in the ledger is adequate for a node to calculate the correct state.
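A minimal Python sketch of that determinism: every node that replays the same ordered ledger from the same initial state computes the same balances (names and amounts are illustrative):

```python
# Deterministic state from a totally ordered ledger: same genesis state plus
# same transaction sequence always yields the same balances on every node.
def replay(initial_state: dict, ordered_txs: list) -> dict:
    state = dict(initial_state)
    for sender, receiver, amount in ordered_txs:
        if state.get(sender, 0) >= amount:   # reject overdrafts
            state[sender] -= amount
            state[receiver] = state.get(receiver, 0) + amount
    return state

genesis = {"alice": 10, "bob": 0}
ledger = [("alice", "bob", 7), ("bob", "alice", 3)]
# Any two nodes given the same ledger compute the same state:
assert replay(genesis, ledger) == replay(genesis, ledger) == {"alice": 6, "bob": 4}
```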


DAG-Based Cryptocurrency
Let's see if the DAG ledger has this property. Fortunately, in a DAG ledger, although the order between some transactions is not deterministic, correct state calculation is not affected, because the additions and subtractions of cryptocurrency balances satisfy the commutative law. As long as no account balance drops below 0, the exact order of transactions is not essential. Therefore, no matter how the DAG ledger is traversed, the final calculated account balances will be the same. In other words, any node can restore the correct state from the DAG ledger.
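The commutativity argument can be checked directly: applying the same set of independent transfers in every possible order yields the identical final state, as this illustrative sketch shows (it assumes no balance goes negative in any ordering):

```python
# Balance updates commute: every permutation of the transfers produces the
# same final state, so any DAG traversal order is fine for plain payments.
from itertools import permutations

def apply_all(txs):
    state = {"alice": 10, "bob": 5, "carol": 0}
    for sender, receiver, amount in txs:
        state[sender] -= amount
        state[receiver] += amount
    return state

txs = [("alice", "bob", 2), ("bob", "carol", 1), ("alice", "carol", 4)]
results = {tuple(sorted(apply_all(order).items())) for order in permutations(txs)}
assert len(results) == 1  # every ordering produces the identical state
```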


Smart Contract
After covering cryptocurrency, let’s take a look at smart contracts. In the real world, there are plenty of scenarios that require more than just recording account balances. For example, a flight-booking application needs to record seat information in the state. Here, a transaction is no longer just a currency transfer; it may contain request data for a smart contract, such as a ticket-booking request. Now, changing the order of two transactions may result in different states: if both Alice and Bob try to book the same flight seat, the seat is allocated to whoever books first. In a smart-contract scenario, the commutative law no longer holds between transactions, so transaction order must also be taken into account.
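The seat-booking example can be written out directly. In this toy contract (hypothetical, for illustration only), applying the same two requests in different orders yields different final states:

def book(state: dict, passenger: str, seat: str) -> dict:
    # First-come-first-served: a seat goes to whoever's request is applied first.
    return state if seat in state else {**state, seat: passenger}

alice_first = book(book({}, "alice", "12A"), "bob", "12A")
bob_first   = book(book({}, "bob", "12A"), "alice", "12A")

print(alice_first)  # {'12A': 'alice'}
print(bob_first)    # {'12A': 'bob'}: the commutative law no longer holds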


So how do we determine which transactions must be ordered and which are order-irrelevant? An ideal solution would be a function that determines, according to the smart-contract logic, whether exchanging the order of any two transactions affects the final state. If such a function existed, we would know which transactions must be linked by hash references in the DAG ledger; in the diagram above, any two order-relevant transactions are connected by an arrow. Unfortunately, this function is computationally expensive and cannot be used in the real world, so we have to skip this ‘perfect’ solution and look for a simpler, more practical method instead.
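For illustration, here is that 'perfect' check in brute-force form, reusing the toy booking rule from the previous sketch: apply the two transactions in both orders and compare the resulting states. It works on a single toy state, but running it over every transaction pair against every reachable state is what makes the approach infeasible in practice:

def book(state: dict, passenger: str, seat: str) -> dict:
    return state if seat in state else {**state, seat: passenger}

def order_relevant(state: dict, tx_a: tuple, tx_b: tuple) -> bool:
    # Apply both orders and compare: different results mean the pair must be ordered.
    ab = book(book(state, *tx_a), *tx_b)
    ba = book(book(state, *tx_b), *tx_a)
    return ab != ba

print(order_relevant({}, ("alice", "12A"), ("bob", "12A")))  # True: order matters
print(order_relevant({}, ("alice", "12A"), ("bob", "14C")))  # False: order-irrelevant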

Transaction Collation
In order to obtain a simple transaction ordering, we can impose two restrictions on the system:

First, we consider the state of the system, or ‘world state’, as the combination of the states of each account. The states of any two accounts are independent and do not affect each other: the balance of one user account will not change due to a change in the balance of another account, and the data of one contract will not be affected by another contract.

Second, we limit each transaction to changing the state of only one account. For example, under this scheme a transfer transaction either reduces the balance of one account or increases it; it is not allowed to change the balances of two accounts at the same time. In other words, transfer transactions are split into ‘sending transactions’ and ‘receiving transactions’. Similarly, smart-contract invocations are split into ‘request transactions’ and ‘response transactions’.

When you combine the two restrictions, the collation becomes simple: transactions that affect the state of the same account must be ordered, and each pair of ‘request’ and ‘response’ transactions must also be ordered, with the request transaction coming before the response transaction.
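A sketch of this split (type and field names are ours, not Vite's actual data structures):

from dataclasses import dataclass

@dataclass
class SendTx:
    account: str         # the only account whose state this transaction changes
    to: str
    amount: int

@dataclass
class ReceiveTx:
    account: str         # the only account whose state this transaction changes
    responds_to: SendTx  # the request; collation puts the response after it

def transfer(sender: str, receiver: str, amount: int):
    # One logical transfer becomes two single-account transactions.
    send = SendTx(account=sender, to=receiver, amount=amount)
    recv = ReceiveTx(account=receiver, responds_to=send)
    return send, recv

send, recv = transfer("alice", "bob", 10)
# Collation: (1) transactions sharing the same .account are totally ordered;
# (2) recv must come after the send it responds to. Nothing else needs ordering.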

Block Lattice
Following the collation described in the previous section, in a DAG ledger each account has an ordered transaction list, in effect its own blockchain, called an ‘account chain’. In addition, different account chains may establish links with each other via request-response transaction pairs. A DAG ledger with this structure is also known as a ‘block lattice’.


Transactions in a block lattice are ordered: regardless of the order in which the DAG ledger is traversed, the same world state can always be calculated by following the per-account order of transactions recorded in the ledger. This is why Vite employs the block lattice as the ledger structure on which to build smart contracts. DAG structures used by other projects (such as ‘Tangle’) are unfortunately not smart-contract friendly, and additional transaction collation must be introduced to support smart contracts; this would amount to adding another DAG specifically for smart contracts on top of the original DAG ledger, which would be very challenging and complex to implement.
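To tie the pieces together, here is a minimal block-lattice model (a deliberate simplification under our own naming, not Vite's implementation): each account appends blocks to its own chain, and receive blocks hold cross-chain references to the sends they respond to.

from collections import defaultdict

chains = defaultdict(list)  # account -> its own ordered chain of blocks

def add_send(sender: str, receiver: str, amount: int) -> dict:
    block = {"type": "send", "to": receiver, "amount": amount,
             "height": len(chains[sender])}   # position in the sender's chain
    chains[sender].append(block)
    return block

def add_receive(receiver: str, send_block: dict) -> dict:
    block = {"type": "receive", "links_to": send_block,  # cross-chain edge
             "height": len(chains[receiver])}
    chains[receiver].append(block)
    return block

s = add_send("alice", "bob", 10)
add_receive("bob", s)
# Within a chain, 'height' fixes the order; across chains, 'links_to' edges order
# request/response pairs. Together they determine a unique world state.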

Summary
This article has briefly described a DAG ledger structure suitable for implementing smart contracts, and why it was adopted. For more detailed information, please refer to the Vite whitepaper at www.vite.org.
submitted by Bonita_V to ico [link] [comments]

The Strange Birth & History of Monero, Part III: Decentralized team

You can read part I here (by americanpegaus). That post motivated me to write part II. Now I'm doing a third part, and there'll be a final 4th part. This is probably too much, but I wasn't able to make it shorter. Some will be interested in going through all of them, and maybe someone is even willing to make a summary of the whole series :D.
Monero - an anonymous coin based on CryptoNote technology
https://bitcointalk.org/index.php?topic=582080.0
Comments of interest:
-4: "No change, this is just a renaming. In the future, the binaries will have to be changed, as well as some URL, but that's all. By the way, this very account (monero) is shared by several user and is meant to make it easier to change the OP in case of vacancy of the OP. This idea of a shared OP comes from Karmacoin.
Some more things to come:
"
(https://bitcointalk.org/index.php?topic=582080.msg6362672#msg6362672)
-5: “Before this thread is too big, I would like to state that a bug has been identified in the emission curve and we are currently in the process of fixing it (me, TFT, and smooth). Currently coins are emitted at double the rate that was intended. We will correct this in the future, likely by bitshifting values of outputs before a certain height, and then correcting 1 min blocks to 2 min blocks. The changes proposed will be published to a Monero Improvement Protocol on github.”
(https://bitcointalk.org/index.php?topic=582080.msg6363016#msg6363016)
[tacotime makes the emission curve bug public: token creation is currently running at twice the intended rate. See the chart comparing BTC's curve with the actual XMR curve, as it was and as it is now, versus the curve initially planned, in yellow.]
-14: “Moving discussion to more relevant thread, previous found here:
https://bitcointalk.org/index.php?topic=578192.msg6364026#msg6364026
I have to say that I am surprised that such an idea [halving current balances and then changing block target to 2 min with same block reward to solve the emission curve issue] is even being countenanced - there are several obvious arguments against it.
Perception - what kind of uproar would happen if this was tried on a more established coin? How can users be expected to trust a coin where it is perceived that the devs are able and willing to "dip" into people's wallets to solve problems?
Technically - people are trying to suggest that this will make no difference since it applies to reward and supply, which might be fair enough if the cap was halved also, but it isn't. People's holdings in the coin are being halved, however it is dressed up.
Market price - How can introducing uncertainty in the contents of people's wallets possibly help market price? I may well be making a fool of myself here, but I have never heard of such a fix before, unless you had savings in a Cypriot bank - has this ever been done for another coin?”
(https://bitcointalk.org/index.php?topic=582080.msg6364174#msg6364174)
-15: “You make good points but unfortunately conflicting statements were made and it isn't possible to stick to them all. It was said that this coin had a mining reward schedule similar to bitcoin. In fact it is twice as fast as intended, even even a bit more than twice as fast as bitcoin.
If you acquired your coins on the basis of the advertised reward schedule, you would be disappointed, and rightfully so, as more coins come to into existence more quickly than you were led to believe.
To simply ignore that aspect of the bug is highly problematic. Every solution may be highly problematic, but the one being proposed was agreed as being the least bad by most of the major stakeholders. Maybe it will still not work, this coin will collapse, and there will need to be a relaunch, in which case all your coins will likely be worthless. I hope that doesn't happen.”
(https://bitcointalk.org/index.php?topic=582080.msg6364242#msg6364242)
[smooth tries to justify his proposal to solve the emission curve issue: halve every current balance and change block target to 2 min with same block reward]
-16: “This coin wasn't working as advertised. It was supposed to be mined slowly like BTC but under the current emission schedule, 39% would be mined by the first year and 86% by the fourth year. Those targets have been moved out by a factor of 2, i.e. 86% mined by year 8, which is more like BTC's 75% by year 8. So the cap has been moved out much further into the future, constraining present and near-term supply, which is what determines the price.”
(https://bitcointalk.org/index.php?topic=582080.msg6364257#msg6364257)
[eizh supports smooth’s plan]
-20: “So long as the process is fair and transparent it makes no difference what the number is... n or n/2 is the same relative value so long as the /2 is applied to everyone. Correcting this now will avoid people accusing the coin of a favourable premine for people who mined in the first week.”
(https://bitcointalk.org/index.php?topic=582080.msg6364338#msg6364338)
[random user supporting smooth’s idea]
-21: “Why not a reduction in block reward of slightly more than half to bring it into line with the proposed graph? That would avoid all sorts of perceptual problems, would not upset present coin holders and be barely noticeable to future miners since less than one percent of coins have been mined so far, the alteration would be very small?”
(https://bitcointalk.org/index.php?topic=582080.msg6364348#msg6364348)
-22: “Because that still turns into a pre-mine or instamine where a few people got twice as many coins as everyone else in the first week.
This was always a bug, and should be treated as such.”
(https://bitcointalk.org/index.php?topic=582080.msg6364370#msg6364370)
[smooth wants to be sure they can’t be stigmatized as “premine”]
-23: “No, not true [answering to "it makes no difference what the number is... n or n/2 is the same relative value so long as the /2 is applied to everyone"]. Your share of the 18,000,000 coins is being halved - rightly or wrongly.”
(https://bitcointalk.org/index.php?topic=582080.msg6364382#msg6364382)
[a good point from a user pushing back hard against smooth and his proposal]
-28: “+1 for halving all coins in circulation. Would they completely disappear? What would the process be?”
-31: “I will wait for the next coin based on CryptoNote. Many people, including myself, avoided BMR because TFT released without accepting input from anyone (afaik). I pm'ed TFT 8 days before launch to help and didn't get response until after launch. Based on posting within the thread, I bet there were other people. Now the broken code gets "fixed" by taking away coins.”
(https://bitcointalk.org/index.php?topic=582080.msg6364531#msg6364531)
-32: “What you say is true, and I can't blame anyone from simply dropping this coin and wanting a complete fresh start instead. On the other hand, this coin is still gaining in popularity and is already getting close to bytecoin in hash rate, while avoiding its ninja premine. There is a lot done right here, and definitely a few mistakes.”
(https://bitcointalk.org/index.php?topic=582080.msg6364574#msg6364574)
[smooth stands up for the project's legitimacy despite the bugs]
-37: “Since everything is scaled and retroactive, the only person to be affected is... me. Tongue Because I bought BMR with BTC, priced it with incorrect information, and my share relative to the eventual maximum has been halved. Oh well. The rest merely mined coins that never should have been mined. The "taking away coins" isn't a symptom of the fix: it's the fundamental thing that needed fixing. The result is more egalitarian and follows the original intention. Software is always a work-in-progress. Waiting for something ideal at launch is pretty hopeless. edit: Let me point out that most top cryptocurrencies today were released before KGW and other new difficulty retargeting algorithms became widespread. Consequently they had massive instamines on the first day, even favorites in good standing like LTC. Here the early miners are voluntarily reducing their eventual stake for the sake of fairness. How cool is that?”
(https://bitcointalk.org/index.php?topic=582080.msg6364886#msg6364886)
[this is eizh supporting the project too]
-43: “I'm baffled that people are arguing about us making the emission schedule more fair. I'm an early adopter. This halves my money, and it's what I want to do. There's another change that needs to be talked about too: we don't believe that microscopic levels of inflation achieved at 9 or 10 years will secure a proof-of-work network. In fact, there's a vast amount of evidence from DogeCoin and InfiniteCoin that it will not. So, we'd like to fix reward when it goes between 0.25 - 1.00 coins. To do so, we need to further bitshift values to decrease the supply under 264-1 atomic units to accommodate this. Again, this hurts early adopters (like me), but is designed to ensure the correct operation of the chain in the long run. It's less than a week old, and if we're going to hardfork in economic changes that make sense we should do it now. We're real devs turning monero into the coin it should have been, and our active commitment should be nothing but good news. Fuck the pump and dumps, we're here to create something with value that people can use.”
(https://bitcointalk.org/index.php?topic=582080.msg6366134#msg6366134)
[tacotime brings the tail emission proposal to the public for the first time and writes what is my favourite sentence in the whole history of Monero: “Fuck the pump and dumps, we're here to create something with value that people can use”]
-51: “I think this is the right attitude. Like you I stand to "lose" from this decision in having my early mining halved, but I welcome it. Given how scammy the average coin launch is, I think maximizing fairness for everyone is the right move. Combining a fair distribution with the innovation of Cryptonote tech could be what differentiates Monero from other coins.”
(https://bitcointalk.org/index.php?topic=582080.msg6366346#msg6366346)
-59: “Hello! It is very good that you've created this thread. I'm ok about renaming. But I can't agree with any protocol changes based only on decisions made by bitcointalk.org people. This is because not all miners are continiously reading forum. Any decision about protocol changes are to be made by hashpower-based voting. From my side I will agree on such a decision only if more than 50% of miners will agree. Without even such a simple majority from miners such changes are meaningless. In case of hardfork that isn't supported by majority of miners the network will split into two nets with low-power fork and high-power not-forking branches. I don't think that this will be good for anybody. Such a voting is easy to be implemented by setting minor_version of blocks to a specific value and counting decisions made after 1000 of blocks. Do you agree with such a procedure?”
(https://bitcointalk.org/index.php?topic=582080.msg6368478#msg6368478)
[TFT appears after a couple days of inactivity]
-63: “In few days I will publish a code with merged mining support. This code will be turned ON only by voting process from miners. What does it mean:
The same procedure is suitable for all other protocol changes.”
(https://bitcointalk.org/index.php?topic=582080.msg6368720#msg6368720)
[And now he is back; TFT is all about merged mining]
-67: “We don't agree that a reverse split amounts to "taking" coins. I also wouldn't agree that a regular forward split would be "giving" coins. It's an exchange of old coins with new coins, with very nearly the exact same value. There is a very slight difference in value due to the way the reward schedule is capped, but that won't be relevant for years or decades. Such a change is entirely reasonable to fix an error in a in coin that has only existed for a week.”
(https://bitcointalk.org/index.php?topic=582080.msg6368861#msg6368861)
-68: “There were no error made in this coin but now there is an initiative to make some changes. Changes are always bad and changes destroy participant confidence even in case these changes are looking as useful. We have to be very careful before making any changes in coins”
(https://bitcointalk.org/index.php?topic=582080.msg6368939#msg6368939)
[TFT does not accept that the unexpected emission curve is a bug]
-72: “You are wrong TFT. The original announcement described the coin as having a reward curve "close to Bitcoin's original curve" (those are your exact words). The code as implemented has a reward curve that is nothing like bitcoin. It will be 86% mined in 4 years. It will be 98% mined in 8 years. Bitcoin is 50% mined in 4 years, and 75% in 8 years.
With respect TFT, you did the original fork, and you deserve credit for that. But this coin has now gone beyond your initial vision. It isn't just a question of whether miners are on bitcointalk or not.
There is a great team of people who are working hard to make this coin a success, and this team is collaborating regularly through forum posts, IRC, PM and email. And beyond that a community of users who by and large have been very supportive of the efforts we've taken to move this forward.
Also, miners aren't the only stakeholders, and while a miner voting process is great, it isn't the answer to every question. Though I do agree that miners need to be on board with any hard fork to avoid a harmful split.”
(https://bitcointalk.org/index.php?topic=582080.msg6369137#msg6369137)
[smooth publicly breaks with TFT for the first time]
-75: “I suppose that merged mining as a possible option is a good idea as soon as nobody is forced to use it. MM is a possibility to accept PoW calculated for some other network. It helps to increase a security of both networks and makes it possible for miners not to choose between two networks if they want both:
Important things to know about MM:
Actually the only change that goes with MM is that we are able to accept PoW from some other net with same hash-function. Each miner can decide his own other net he will merge mine BMR with.
And this is still very secure.
This way I don't see any disadvantage in merged mining. What disadvantages do you see in MM?”
(https://bitcointalk.org/index.php?topic=582080.msg6369255#msg6369255)
[TFT stands for merged mining]
-77: “Merged mining essentially forces people to merge both coins because that is the only economically rational decision. I do not want to support the ninja-premined coin with our hash rate.
Merged mining makes perfect sense for a coin with a very low hash rate, otherwise unable to secure itself effectively. That is the case with coins that merge mine with bitcoin. This coin already has 60% of the hash rate of bytecoin, and has no need to attach itself to another coin and encourage sharing of hash rate between the two. It stands well on its own and will likely eclipse bytecoin very soon.
I want people to make a clear choice between the fair launched coin and the ninja-premine that was already 80% mined before it was made public. Given such a choice I believe most will just choose this coin. Letting them choose both allows bytecoin to free ride on what we are doing here. Let the ninja-preminers go their own way.”
(https://bitcointalk.org/index.php?topic=582080.msg6369386#msg6369386)
[smooth again]
-85: “One of you is saying that there was no mistake in the emission formula, while the other is. I'm not asking which I should believe . . I'm asking for a way to verify this”
(https://bitcointalk.org/index.php?topic=582080.msg6369874#msg6369874)
[those that have not been paying attention to the soap opera since the beginning do not understand anything at all]
-86: “The quote I posted "close to Bitcoin's original curve" is from the original announcement here: https://bitcointalk.org/index.php?topic=563821.0
I think there was also some discussion on the thread about it being desirable to do that.
At one point in that discussion, I suggested increasing the denominator by a factor of 4, which is what ended up being done, but I also suggested retaining the block target at 2 minutes, which was not done. The effect of making one change without the other is to double the emission rate from something close to bitcoin to something much faster (see chart a few pages back on this thread).”
(https://bitcointalk.org/index.php?topic=582080.msg6369935#msg6369935)
[smooth answers just a few minutes later]
-92: “I'm happy the Bitmonero attracts so much interest.
I'm not happy that some people want to destroy it.
Here is a simple a clear statement about plans: https://bitcointalk.org/index.php?topic=582670
We have two kind of stakeholders we have respect: miders and coin owners.
Before any protocol changes we will ask miners for agreement. No changes without explicit agreement of miners is possible.
We will never take away or discount any coins that are already emitted. This is the way we respect coin owners.
All other issues can be discussed, proposed and voted for. I understand that there are other opinions. All decisions that aren't supported in this coin can be introduced in any new coin. It's ok to start a new fork. It's not ok to try to destroy an existsing network.”
(https://bitcointalk.org/index.php?topic=582080.msg6370324#msg6370324)
[TFT is kinda upset – he can see how the community is “somehow” taking over]
-94: “Sounds like there's probably going to be another fork then. Sigh.
I guess it will take a few tries to get this coin right.
The problem with not adjusting existing coins is that it make this a premine/instamine. If the emission schedule is changed but not as a bug fix, then earlier miners got an unfair advantage over everyone else. Certainly there are coins with premines and instamines, but there's a huge stigma and such a coin will never achieve the level of success we see for this coin. This was carefully discussed during the team meeting, which was announced a day ahead of time, and everyone with any visible involvement with the coin, you included, was invited. It is unfortunate you couldn't make it to that meeting TFT.”
(https://bitcointalk.org/index.php?topic=582080.msg6370411#msg6370411)
[smooth is desperate at TFT's lack of interest in collaborating, and he publicly speaks of a split for the first time]
-115: “Very rough website online, monero.cc (in case you asked, the domain name was voted on IRC, like the crypto name and its code). Webdesigner, webmaster, writers... wanted.”
(https://bitcointalk.org/index.php?topic=582080.msg6374702#msg6374702)
[Despite the lack of consensus and the obvious chaos, the community keeps moving: Monero already has its own site]
-152: “Here's one idea on fixing the emissions without adjusting coin balances.
We temporarily reduce the emission rate to half of the new target for as long as it takes for the total emission from 0 to match the new curve. Thus there will be a temporary period when mining is very slow, and during that period there was a premine.
But once that period is compete, from the perspective of new adopters, there was no premine -- the total amount of coins emitted is exactly what the slow curve says it should be (and the average rate since genesis is almost the same as the rate at which they are mining, for the first year or so at least).
This means the mining rewards will be very low for a while (if done now then roughly two weeks), and may not attract many new miners. However, I think there enough of us early adopters (and even some new adopters who are willing to make a temporary sacrifice) who want to see this coin succeed to carry it through this period.
The sooner this is done the shorter the catch up period needs to be.”
(https://bitcointalk.org/index.php?topic=582080.msg6378032#msg6378032)
[smooth makes a proposal to solve the “emission curve bug” without changing user balances and without favoring the early miners]
-182: “We have added a poll in the freenode IRC room "Poll #2: "Emission future of Monero, please vote!!" started by stickh3ad. Options: #1: "Keep emission like now"; #2: "Keep emission but change blocktime and final reward"; #3: "Keep emission but change blocktime"; #4: "Keep emission but change final reward"; #5: "Change emission"; #6: "Change emission and block time"; #7: "Change emission and block time and final reward"
Right now everyone is voting for #4, including me.”
(https://bitcointalk.org/index.php?topic=582080.msg6379518#msg6379518)
[tacotime announces an ongoing vote on IRC]
-184: “ change emission: need to bitshift old values on the network or double values after a certain block. controversial. not sure if necessary. can be difficult to implement. keep emission: straightforward, we don't keep change emission or block time. change final reward is simple. if (blockSubsidy < finalSubsidy) return finalSubsidy; else return blockSubsidy;”
(https://bitcointalk.org/index.php?topic=582080.msg6379562#msg6379562)
-188: “Yeah, well. We need to change the front page to reflect this if we can all agree on it.
We should post the emissions curve and the height and value that subsidy will be locked in to.
In my opinion this is the least disruptive thing we can do at the moment, and should ensure that the fork continues to be mineable and secure in about 8 years time without relying on fees to secure it (which I think you agree is a bad idea).”
(https://bitcointalk.org/index.php?topic=582080.msg6379871#msg6379871)
[tacotime]
-190: “I don't think the proposed reward curve is bad by any means. I do think it is bad to change the overall intent of a coin's structure and being close to bitcoins reward curve was a bit part of the intent of this coin. It was launched in response to the observation that bytecoin was 80% mined in less than two years (too fast) and also that it was ninja premined, with a stated goal that the new coin have a reward curve close to bitcoin.
At this point I'm pretty much willing to throw in the towel on this launch:
  1. No GUI
  2. No web site
  3. Botched reward curve (at least botched relative to stated intent)
  4. No pool (and people who are enthusiastically trying to mine having trouble getting any blocks; some of them have probably given up and moved on).
  5. No effective team behind it at launch
  6. No Mac binaries (I don't think this is all that big a deal, but its another nail)
I thought this could be fixed but with all the confusion and lack of clear direction or any consistent vision, now I'm not so sure.
I also believe that merged mining is basically a disaster for this coin, and is probably being quietly promoted by the ninjas holding 80% of bytecoin, because they know it keeps their coin from being left behind, and by virtue of first mover advantage, probably relegates any successors to effective irrelevance (like namecoin, etc.).
We can do better. It's probably time to just do better.”
(https://bitcointalk.org/index.php?topic=582080.msg6380065#msg6380065)
[smooth is disappointed]
-191: “The website does exist now, it's just not particularly informative yet. :) But, I agree that thankful_for_today has severely mislead everyone by stating the emission was "close to Bitcoin's" (if he's denying that /2 rather than /4 emission schedule was unintentional, as he seems to be). I'm also against BCN merge mining. It works against the goal of overtaking BCN and if that's not a goal, I don't know what we're even doing here. I'll dedicate my meagre mining to voting against that.
That said, you yourself have previously outlined why relaunches and further clones fail. I'd rather stick with this one and fix it.”
(https://bitcointalk.org/index.php?topic=582080.msg6380235#msg6380235)
[eizh tries to keep smooth on board]
-196: “BCN is still growing as well. It is up to 1.2 million now. If merged mining happens, (almost) everyone will just mine both. The difficulty on this coin will jump up to match BCN (in fact both will likely go higher since the hash rate will be combined) and again it is an instamine situation. (Those here the first week get the benefit of easy non-merged mining, everyone else does not.) Comments were made on this thread about this not being yet another pump-and-dump alt. I think that could have been the case, but sadly, I don't really believe that it is.”
(https://bitcointalk.org/index.php?topic=582080.msg6380778#msg6380778)
-198: “There's no point in fragmenting talent. If you don't think merge mining is a good idea, I'd prefer we just not add it to the code.
Bitcoin had no web site or GUI either initially. Bitcoin-QT was the third Bitcoin client.
If people want a pool, they can make one. There's no point in centralizing the network when it's just began, though. Surely you must feel this way.”
(https://bitcointalk.org/index.php?topic=582080.msg6381866#msg6381866)
[tacotime also wants smooth on board]
-201: “My personal opinion is that I will abandon the fork if merge mining is added. And then we can discuss a new fork. Until then I don't think Monero will be taken over by another fork.”
(https://bitcointalk.org/index.php?topic=582080.msg6381970#msg6381970)
[tacotime fires the first shot: if merged mining is implemented, he will leave the ship]
-203: “Ditto on this. If the intention wasn't to provide a clearweb launched alternative to BCN, then I don't see a reason for this fork to exist. BCN is competition and miners should make a choice.”
(https://bitcointalk.org/index.php?topic=582080.msg6382097#msg6382097)
[eizh supports tacotime]
-204: “+1 Even at the expense of how much I already "invested" in this coin.”
(https://bitcointalk.org/index.php?topic=582080.msg6382177#msg6382177)
[NoodleDoodle is also against merged mining]
This is basically everything worth reading in this thread. The thread was created in the wrong category, and its short life of about 2 days was pretty interesting. Merged mining was rejected, and the episode ended with TFT going inactive for over 7 days and the creation of a new GitHub repo on April 30th. It is only 12 days since launch, and a decentralized team is already being built.
Basically, the community had forked (but not the chain), and it was evolving and moving forward toward its still-unclear future.
These are the main takeaways of this thread:
  • The legitimacy of the community's "leaders" is proven when they propose and support halving the balances for the greater good, solving the emission curve issue without leaving room for any instamine accusation. It also shows their long-term goals and values in rejecting merged mining with a "premined scam".
  • It is decided that, for now, it is “too late” to change the emission curve; monero will therefore mint 50% of its coins in ~1.3 years (bitcoin did it in 3.66 years) and 86% of its coins in 4 years (bitcoin does it in ~11 years) (this was also voted here) (see also this chart)
  • It is decided that a “minimum subsidy” or “tail emission” will be added to incentivize miners “forever” and avoid relying on escalating fees (it was finally added to the code in March 2015)
  • Merged mining is flatly rejected by the future “core team” and soon rejected by "everyone". This will trigger TFT's inactivity.
  • The future “core team” is somehow forming in a decentralized way: tacotime, eizh, NoodleDoodle, smooth and many others
And most important of all: all this (and what is coming soon) is proof of the decentralization of Monero, probably comparable to Bitcoin's first days. This is not a company building a for-profit project (even one that is not-for-profit on paper); this is a group of unconnected individuals sharing a goal and working together to reach it.
A final part will follow soon, where I'll collect the bitcointalk logs from the current official announcement threads. There you'll be able to follow the first decentralized steps of development (open-source pool, miner optimizations and exchanges), all surrounded by FUD trolls, lots of excitement and a rapidly growing collaborative community.
submitted by el_hispano to Monero [link] [comments]
