Mining Bitcoins using a Heterogeneous Computer Architecture
[OWL WATCH] Waiting for "IOTA TIME" 27;
Disclaimer: This is my editing, so there could always be some misunderstandings and exaggerations; plus, many convos are from the 'spec channel', so please take it with a grain of salt. I have also added some recent convos afterward.
--------------------------------------------------
Luigi Vigneri [IF] — Yesterday 8:26 PM
Giving the opportunity to everybody to set up/run nodes is one of IOTA's priorities. A minimum amount of resources is obviously required to prevent easy attacks, but we are making sure that being an active part of the IOTA network is possible without crazy investments. We are building our solution in such a way that the protocol is fair and lightweight.

Hans Moog [IF] — Yesterday 11:24 PM
IOTA is not "free to use", but it is fee-less. You have tokens? You can send them around for free.

Hans Moog [IF] — Yesterday 11:25 PM
You have no tokens? You have to pay to use the network.

lekanovic — Yesterday 11:25 PM
I think it is a smart way to avoid the network spamming problem.

Hans Moog [IF] — Yesterday 11:26 PM
Owning tokens is essentially like owning a share of the actual network and the throughput it can process. If you don't need all of that yourself, you can rent it out to people and earn money.

Hans Moog [IF] — Yesterday 11:27 PM
mana = tokens * time since you own them (simplified)

Hans Moog [IF] — Yesterday 11:27 PM
The longer you hold your tokens and the more you have, the more mana you have. But every now and then you have to move them to "realize" that mana.

lekanovic — Yesterday 11:28 PM
Is there any other project that is using a mana solution to the network fee problem?
Hans Moog [IF] — Yesterday 11:28 PM
Nah. The problem with current protocols is that they are leader-based.

Hans Moog [IF] — Yesterday 11:29 PM
You need absolute consensus on who the current leaders are and what their influence in the network is. That's how blockchains work.

Hans Moog [IF] — Yesterday 11:29 PM
If two block producers produce two blocks at the same time, then you have to choose which one wins and where everybody attaches their next block. IOTA works differently and doesn't need to choose a single leader, so we have much bigger flexibility in designing our sybil protection mechanisms. In a way, mana is also supposed to solve the problem of "rewarding" the infrastructure instead of the validators. In blockchain, only the miners get all the money; running a node, even one that is used by a lot of people, will only cost you. You won't get anything back. No fees, nothing. The miners get it all.

Hans Moog [IF] — Yesterday 11:31 PM
In IOTA, the node operators receive the mana, which gives them a share of the network throughput.

Hans Moog [IF] — Yesterday 11:32 PM
Because in blockchain you need to decide whose txs become part of the blocks, and it's not really based on networking protocols like AIMD.

lekanovic — Yesterday 11:33 PM
And the more mana your node has, the more trust your node has, and you have more say in the FPC. Is that correct?

Hans Moog [IF] — Yesterday 11:33 PM
Yeah. A node that has processed a lot of txs from its users will have more mana than other nodes and therefore a bigger say in deciding conflicts. It's a direct measure of "trust" by its users.

lekanovic — Yesterday 11:34 PM
And choosing the committee for dRNG would be done on the L1 protocol level? Everything regarding mana will be L1 level, right?
Hans Moog [IF] — Yesterday 11:35 PM
Yeah, mana is layer 1, but it will also be used as weight in L2 solutions like smart contracts.

lekanovic — Yesterday 11:35 PM
And you are not dependent on using SC to implement this?

Hans Moog [IF] — Yesterday 11:35 PM
No, you don't need smart contracts. That's all the base layer.

Hans Moog [IF] — Yesterday 11:37 PM
'Time' actually takes into account things like decay, so it doesn't just increase forever. It's close to "demurrage" in monetary theory.

lekanovic — Yesterday 11:36 PM
For projects to be able to connect to Polkadot or Cosmos, you need to get the state of the ledger. Will it be possible to get the Tangle state? If this were possible, then I think it would be SUPER good for IOTA.

Hans Moog [IF] — Yesterday 11:38 PM
Yeah, but Polkadot is not connecting other DLTs. Just in-house stuff.

Hyperware — Yesterday 11:39 PM
Is there still a cap on mana so that the rich don't get richer?

Hans Moog [IF] — Yesterday 11:39 PM
Yes, mana is capped.

TangleAccountant — Yesterday 11:39 PM
u/HansMoog [IF] My first thought is that the evolution of this renting system will lead to several big mana-renting companies that pool together tons of token holders' mana. That way, businesses looking to rent mana just need to deal with a reliable mana-renting company for years instead of a new individual every couple of months (because life happens and you don't know if that individual will need to sell their IOTAs for personal reasons). Any thoughts on this?

Hans Moog [IF] — Yesterday 11:41 PM
u/TangleAccountant Yes, that is likely - but also not a bad thing - token holders will have a place to get their monthly payout, and the companies that want to use the Tangle without holding tokens have a place to pay.

TangleAccountant — Yesterday 11:42 PM
Oh, I completely agree. That's really cool. I'll take a stab at creating one of those companies in the US.
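As a rough illustration of the "mana = tokens * time, with decay" idea described above, here is a toy model. This is a hypothetical sketch, not the official IOTA mana formula; the exponential decay rate and the accrual rule are assumptions made purely to show why mana saturates instead of growing forever:

```python
import math

def mana_after(tokens: float, hours_held: float, decay_per_hour: float = 0.001) -> float:
    """Toy model: mana accrues with holding time but decays exponentially,
    so it saturates instead of growing forever (demurrage-like behavior)."""
    # Closed form of the integral of tokens * e^(-decay * t) dt over [0, hours_held]
    return tokens * (1 - math.exp(-decay_per_hour * hours_held)) / decay_per_hour

# Holding longer yields more mana, but with diminishing returns:
m_day = mana_after(1000, 24)        # one day of holding
m_month = mana_after(1000, 24 * 30) # one month of holding
assert m_day < m_month

# The cap: as holding time grows, mana approaches tokens / decay_per_hour.
cap = 1000 / 0.001
assert mana_after(1000, 10**6) <= cap
```

The design point this mirrors is the "demurrage" remark above: with decay, two holders with the same balance cannot diverge without bound just because one started earlier.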
Hans Moog [IF] — Yesterday 11:42 PM
And everybody who wants to run a node themselves, or has tokens and wants to use the Tangle for free, can do so. But "leechers" who would want to use the network for free won't be able to. I mean, ultimately there will always be "fees", as there is no "free lunch". You have a certain amount of resources that a network can process, and you have a certain demand, and that will naturally result in fees based on supply/demand. What you can do, however, is build a system where the actual users that legitimately want to use it can do so for free, just because they already "invest" enough by holding tokens or running infrastructure. They are already contributing to the well-being of the network through these two aspects alone. It would be stupid to ask those guys for additional fees, and mana essentially tries to be such a measure of honesty among the users.

Hyperware — Yesterday 11:47 PM
It's interesting from an investment perspective that having tokens/mana is like owning a portion of the network.
Hans Moog [IF] — Yesterday 11:48 PM
Yeah, you are owning a certain % of the throughput, and whatever the price ultimately is to execute on this network, you will earn proportionally. But you have to keep in mind that we are trying to build the most efficient DLT that you could possibly ever build.

semibaron — Yesterday 11:48 PM
The whole mana (tokens) = share of network throughput sounds very much like EOS, tbh. Just that EOS uses DPoS.

Hans Moog [IF] — Yesterday 11:50 PM
Yeah, I mean, there are really not too many new things under the sun - you can just tweak a few things here and there when it comes to distributing resources. DPoS is simply not very nice from a centralization aspect.

Hans Moog [IF] — Yesterday 11:50 PM
At least not the way EOS does it. Delegating weights is one thing, but assuming that the weight will always end up with 21 "identities" running the whole network is bad. In the current world you see a centralization of power, but ultimately we want to build a future where wealth is more evenly distributed, and the same goes for voting power.

Hans Moog [IF] — Yesterday 11:52 PM
Blockchain needs leader selection; it only works with such a centralizing component. IOTA doesn't need that. It's delusional to say that IOTA wouldn't have any such centralization, but maybe we can do better than just a hand-selected set of nodes.

Phantom3D — Yesterday 11:52 PM
How would this affect a regular hodler without a node? Should I keep my tokens elsewhere to generate mana and put the tokens to use?

Hans Moog [IF] — Yesterday 11:53 PM
You can do whatever you want with your mana. Just make an account at a node you regularly use and use it to build up a reputation with that node so you can use your funds for free, or run a node yourself, or rent it out to companies if you just hodl.

semibaron — Yesterday 11:54 PM
Will there be a built-in function in the node software / wallet to delegate ("sell") my mana?
Hans Moog [IF] — Yesterday 11:55 PM
u/semibaron Not from the start - that would happen on a 2nd layer.
------------------------------------------------------------------------------------------------------------
dom — Yesterday 9:49 PM
suddenly be incentive to hold iota? to generate Mana

Hyperware — Today 4:21 AM
The only thing I can really do is believe that the IF have smart answers and are still building the best solutions they can for the sake of the vision.

dom — Today 4:43 AM
100% - which is why we're spending so much effort to communicate it more clearly now. We'll do an AMA on this topic very soon.

M [s2] — Today 4:54 AM
u/dom Please accept my question for the AMA: will IOTA remain a permissionless system, and if so, how?

dom — Today 4:57 AM
Of course it remains permissionless.

dom — Today 5:20 AM
What is permissioned about it? Is ETH or Bitcoin permissioned because you have to pay a transaction fee in their native token?

Gerrit — Today 5:24 AM
How did your industry partners think about the mana solution and the fact that they need to hold the token to ensure network throughput?

dom — Today 5:26 AM
u/Gerrit Considering how the infrastructure, legal and regulatory frameworks are improving around the adoption and usage of crypto-currencies within large companies, I really think that we are introducing this concept at exactly the right time. It should make enterprise partners comfortable in using the permissionless network without much of a hurdle. They can always launch their own network if they want to ...

Gerrit — Today 5:27 AM
Launching their own network can't be what you want.

dom — Today 5:27 AM
Exactly. But that is what's happening with Ethereum and all the other networks; they don't hold Ether tokens either.

Gerrit — Today 5:32 AM
Will be very exciting to see if ongoing regulation will "allow" companies to invest in and hold the tokens. With upcoming custody solutions, that would be a fantastic play.
Hans Moog [IF] — Today 5:34 AM
It's still possible to send transactions even without mana - mana is only used in times of congestion to give the people that have more mana more priority. There will still be sharding to keep the network free most of the time.

Hans Moog [IF] — Today 5:35 AM
But without a protection mechanism, somebody could just spam a lot of bullshit and break the network. (edited) You need some form of protection from this.

M [s2] — Today 5:36 AM
u/HansMoog [IF] So when I have 0 mana, I can still send transactions? This is actually the point where it got strange...

Hans Moog [IF] — Today 5:37 AM
Yes, you can - unless the network is close to its processing capabilities or being attacked by spammers. Then the nodes will favor the mana holders.

Hans Moog [IF] — Today 5:37 AM
But having mana is not a requirement for many years to come. Currently, even people with FPGAs can't spam that many tps, and we will also have sharding implemented by then.

M [s2] — Today 5:39 AM
Thank you, u/HansMoog [IF]! This is the actually important piece of info!

Basha — Today 5:38 AM
OK, I thought it was communicated that you need at least 1 mana to process a transaction. From the blog post: "... a node with 0 mana can issue no transactions."
Maybe they meant during congestion, but if that's the case, maybe you should add that.

Hans Moog [IF] — Today 5:42 AM
It's under the point "Congestion control:". Yeah, this only applies to spam attacks. Network not overloaded = no mana needed.

Hans Moog [IF] — Today 5:43 AM
If congested => favor txs from people who have the most skin in the game. But sharding will try to keep the network non-congested most of the time - although there might be short periods where an attacker brings the network close to its limits. And of course it's going to take a while to add this, so we need a protection mechanism until sharding is supported. (edited)

Hans Moog [IF] — Today 6:36 AM
I don't have a particular problem with EOS or their number of validators - the reason why I think blockchain is inferior really has nothing to do with the way you do sybil protection. And by validators I mean "voting nodes". I mean, even Bitcoin has fewer mining pools, and you could compare mining pools to DPoS in some sense, where people assign their weight (in that case, hashing power) to the corresponding mining pools. So EOS is definitely not less decentralized than any other tech. But having more identities carrying weight in the decision process definitely makes it harder to corrupt a reasonable fraction of the system and makes it easier to shard, so it's desirable to have this property. (edited)
-------------------------------------------------
Antonio Nardella [IF] — Today 3:36 AM
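The congestion-control behavior described in the messages above ("network not overloaded = no mana needed; if congested, favor the senders with the most skin in the game") can be sketched as follows. This is a hypothetical illustration, not IOTA's actual scheduler; the capacity threshold and the simple sort-by-mana rule are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Tx:
    sender: str
    mana: float  # mana held by the issuing node

def schedule(pending: list, capacity: int) -> list:
    """Toy congestion control: admit everything while uncongested;
    under congestion, prioritize transactions from high-mana senders."""
    if len(pending) <= capacity:
        # Network not overloaded: mana is irrelevant, everything goes through.
        return pending
    # Congested: highest-mana senders first; the rest wait.
    return sorted(pending, key=lambda tx: tx.mana, reverse=True)[:capacity]

txs = [Tx("zero-mana", 0.0), Tx("whale", 500.0), Tx("small", 5.0)]
assert schedule(txs, capacity=3) == txs                # uncongested: all pass
winners = schedule(txs, capacity=2)                    # congested: mana decides
assert [t.sender for t in winners] == ["whale", "small"]
```

Note how the zero-mana sender is only excluded in the congested case, which matches the clarification that 0 mana still allows sending while the network has spare capacity.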
u/C3PO [92% Cooless] They could also add more git repos instead of just the wallet one, and we would probably be #1 there too..
----------------------------------------------------------------------------------
Disclaimer: I'm sorry - maybe I'm fueling some confusion by posting this mana thing too soon - but instead of erasing this post, I'm adding recent convos. Certain things about mana do not seem to be clear yet. It would be better to wait for some official clarification. But I hope the community gives its full support to the IF, because there could always be some bumps along the untouched, uncharted way.
--------------------------------------------------------------------------------------
Recent addition:
Billy Sanders [IF] — Today 1:36 PM
> It's still possible to send transactions even without mana - mana is only used in times of congestion to give the people that have more mana more priority
u/HansMoog [IF] I'm sorry Hans, but this is false in the current congestion control algorithm. No mana = no transactions. To be honest, we haven't really tried to make it work so that you can send transactions with no mana during times with no congestion, but I don't see how you can enable this and still maintain the sybil protection required. u/LuigiVigneri [IF] What do you think?
Dave [EF] — Today 2:19 PM
Suggestion: Sidebar, then get back to us with the verdict. (edited)

dom — Today 2:27 PM
"No mana, no tx" will definitely not be the case. (edited)
[2:28 PM] Billy probably means the previous rate control paper, as it was written by Luigi. I'll clarify with them.
Hans Moog [IF] — Today 2:29 PM
When was this decided, u/BillySanders [IF], and by whom? Was this discussed at the last resum when I wasn't there? The last info that I had was that the congestion control should only kick in when there is congestion?!?
Navin Ramachandran [IF] — Today 2:30 PM
Let's sidebar this discussion and return when we have agreement. Dave has the right idea.
One of the little-known aspects of bitcoin is the nature of the proof of work system. There are many people, especially those who support a UASF or a PoW change, who believe a distributed system should be completed as a mesh. In this, they confuse centralised systems with centrality. The truth of the matter is that no matter which proof of work system is implemented, they all follow a maximal growth curve that reflects the nature of the firm as detailed by Ronald Coase (1937). The bitcoin white paper was very specific: users of the system "vote with their CPU power". What this means is that the system was never designed to give one vote per person. It is designed purely around economic incentives: individuals with more hash power will have provided more investment into the system. Those who invest more in the system gain more say in the system. At the same time, no one or even two individuals can gain complete control of the system. We will explore the nature of cartels separately, but these always fail without government intervention. The reason cartels fail comes down to the simple incentivisation of the most efficient member: the strongest cartel member always ends up propping up the weakest. This leads to a strategy of defection. No proof of work-based solution ever allows for a scenario where you have one vote per person. The anti-sybil functions of bitcoin and all other related systems based on proof of work or similar derivatives derive from an investment-based strategy. Alternatives to ASIC-based systems are constantly proposed as a methodology of limiting what is termed the centralisation of proof of work systems. The truth of the matter is that the mining function within any proof of work system naturally aligns with business interests. This leads to corporations running machines within data centres.
In the same way that democracies and republics have migrated away from small groups of people individually voting for an outcome towards a vote for a party, the transactional costs associated with individual choice naturally lead to corporate solutions. In this, the corporation mirrors a political party. In this paper, we address the issues of using alternative proof of work systems that attempt to create a one-person-one-vote scenario in place of economic incentivisation. We will demonstrate conclusively that all systems migrate to a state of economic efficiency. The consequence of this is that systems form into groups designed to maximise returns. The effect is that bitcoin is not only incentive-compatible but optimal. No system can efficiently collapse into an order of one vote per individual and remain secure. In the firm-based nature of bitcoin, we demonstrate that the inherent nature of the firm is reflected within mining pools. Multiple aggregation strategies exist, ranging from collective firms where members can easily join or leave (mining pools) through to more standard corporate structures that are successful within any proof of work system. The system was designed to be based on one vote per CPU (Satoshi, 2008) and not one vote per person or one vote per IP address. The reason for this is simple: there is no methodology available that can solve byzantine consensus on an individual basis. The solution developed within bitcoin solves this economically, using investment. The parties signal their intent to remain bound to the protocol through a significant investment. Those parties that follow the protocol are rewarded.
The alternative strategy takes us back to former, failed systems such as e-cash that could not adequately solve Sybil attacks and decentralise the network. Bitcoin manages to maintain the decentralised nature of the network through a requirement that no individual party can ever achieve more than 50% of the network hash rate. In all proof of work systems, there is a requirement to inject a costly signal into the network that is designed as the security control. Many people believe that the cryptographic element, namely the hashing process, is the security feature of bitcoin. This is a fallacy; it is the economic cost that is relevant to the overall system, not the individual element. The benefit of a hash function is that it is difficult to solve, in the manner of the proof of work algorithm, but easy to verify. This economic asymmetry is one of the key features of bitcoin. Once a user has found a solution, they know it can be quickly broadcast and verified by others. Additionally, the hash algorithm provides a fair distribution system based on the amount of invested hash rate. The distinction from the proof of stake solutions that have been proposed comes in the requirement to constantly reinvest. A proof of stake system requires a single investment. Once this investment is made, the system is incentivised towards the protection of the earlier investment. This leads to a scenario known as a strategic oligopoly game. The solution using a proof of work algorithm is the introduction of an ongoing investment. This differs from an oligopoly game in that sunk cost cannot make up for continued investment. In a proof of stake system, prior investment is crystallised, allowing continued control with little further investment. Proof of work differs in that it requires continuous investment. More than this, it requires innovation. As with all capitalist systems, they are subject to Schumpeterian dynamic change (Schumpeter, 1994).
The system of creative destruction allows for cycles of innovation. Each innovation leads to waves of creation over the destruction of the old order. This process creates continued growth. Proof of work-based systems continue to grow and continue to update and change. Any incumbent corporation or other entity needs to continue to invest, knowing that its continued dominance is not assured. In bitcoin, we have seen innovative leaps as people moved from CPU-based mining to GPU-based systems. This initial innovation altered the software structure associated with the mining process in bitcoin. That change significantly altered the playing field, leading to novel techniques associated with FPGAs and later ASICs dedicated to a specific part of the mining process. The error held by many people is that this move from a CPU-based solution to more costly implementations could have been averted. A consequence of this has been the introduction of alternative proof of work systems into many of the alt-coins, Litecoin and Dogecoin for example. These systems have been implemented without the understanding that the use of ASICs is not the issue; the issue is the belief that individual users mining in a mesh system can form a successful proof of work. In the unlikely event that a specialised algorithm were implemented that could only run once on any one CPU, it would still lead to the eventual creation of corporate data centres for mining. In the section above, we showed using Arrow's theorem how only a single-use proof of work system can be effective. If we extend this and look at the Theory of the Firm (Coase, 1937), we note that in a system of prices, production could be carried out without any organisation. One issue against this arises from the cost of information.
Interestingly, as we move into a world of increasingly more information, it is scarce information that becomes important. As the amount of information becomes more voluminous, the ability to uncover accurate and timely information becomes scarcer. The ability to specialise in the coordination of the various factors of production and the distribution of information leads towards vertical integration within firms. We see this first voiced in Adam Smith's (Smith, 1776) postulation on the firm. Everyone can choose to either seek further information or act on the information that they already have. This information can be in the form of market knowledge, product knowledge, or expertise, but at some point the individual needs to decide to act. There is a cost to obtaining information. The returns on obtaining more information hit a maximum level and start to decrease at a certain point. The entrepreneur acts as a guiding influence, managing the risk associated with incomplete information against the risk of not acting but rather waiting to obtain more information. In the instance of bitcoin mining, the firm can increase in size through the integration of multiple specialist roles. Even given the assumption that any one mining process can run on only a single CPU, we come to the scenario of high-end datacentre servers. The Intel Xeon Phi 7290F implements 72 Atom CPU cores, each running two threads. Even taking the control system into account, this leaves 142 processes able to run per system. With four cards per RU, this allows for datacentre implementations of 5,964 mining processes on a pure CPU-based proof of work implementation. One person can manage a small number of mining servers within a home or small business environment. In large datacentre-based organisations such as Facebook, a single administrator can run 20,000 servers. The effect of this would be one individual managing 2,840,000 individual CPU-based mining processes.
This alone is outside the scaling capabilities of any individual. The effect is further enhanced when cost savings from the creation of large data centres, management savings, and the integration of multiple network and systems administrators are considered. As we add additional layers, we reach a maximum beyond which it is no longer profitable to grow the firm in size. Right up until that point, the firm will grow.
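The arithmetic in the datacentre example above can be checked directly. The server and thread counts are the figures given in the text; only the calculation itself is reproduced here:

```python
# Per-system process count from the text: 72 cores x 2 threads,
# minus two threads reserved for the control system.
cores, threads_per_core, control_threads = 72, 2, 2
per_system = cores * threads_per_core - control_threads
assert per_system == 142

# One administrator managing 20,000 such servers (the Facebook-scale figure)
# yields the 2,840,000 mining processes quoted in the text.
servers_per_admin = 20_000
assert per_system * servers_per_admin == 2_840_000
```

This makes the scaling argument concrete: a single administrator's span of control covers millions of "one vote per CPU" processes, which is why even a CPU-only algorithm would concentrate in firms.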
Transcript of Open Developer Meeting in Discord - 7/19/2019
[Dev-Happy] Blondfrogs — Last Friday at 3:58 PM
Hey everyone. The channel is now open for the dev meeting.

LSJI07 - MBIT — Last Friday at 3:58 PM
Hi

Tron — Last Friday at 3:59 PM
Hi all!

Jeroz — Last Friday at 3:59 PM
:wave:

Tron — Last Friday at 3:59 PM
Topics: algo stuff - x22rc, ownership token for Restricted Assets, and Assets.

Jeroz — Last Friday at 4:00 PM
@Milo is also here from Coin Request.

Milo — Last Friday at 4:00 PM
Hi :thumbsup:

Pho3nix Monk3y — Last Friday at 4:00 PM
Welcome, @Milo

Tron — Last Friday at 4:00 PM
Great. @Milo Were there PRs for Android and iOS?

Milo — Last Friday at 4:01 PM
Yes, I've made a video. Give me a second, I'll share it asap.

Jeroz — Last Friday at 4:02 PM
I missed the iOS one.

Milo — Last Friday at 4:02 PM
Well, it's one video, but meant for all.

Jeroz — Last Friday at 4:02 PM
Ah, there's an issue but no pull request (yet?) https://github.com/RavenProject/ravenwallet-ios/issues/115

[Dev-Happy] Blondfrogs — Last Friday at 4:03 PM
Nice, @Milo

Milo — Last Friday at 4:04 PM
Can it be that I have no video post rights?

Jeroz — Last Friday at 4:05 PM
In Discord?

Milo — Last Friday at 4:05 PM
Yes?

[Dev-Happy] Blondfrogs — Last Friday at 4:05 PM
Just a link?

Jeroz — Last Friday at 4:05 PM
The standard version has a file limit afaik.

Pho3nix Monk3y — Last Friday at 4:05 PM
Try now; gave permissions.

Milo — Last Friday at 4:05 PM
It's not published on YouTube yet, since I didn't know when it would be published in the wallets. File too big. Hold on, I'll put it on YouTube and set it to private.

LSJI07 - MBIT — Last Friday at 4:06 PM
No worries, ipfs it... :yum:

Pho3nix Monk3y — Last Friday at 4:06 PM
OK, just send the link when you can.

[Dev-Happy] Blondfrogs — Last Friday at 4:07 PM
So guys, we released Ravencoin v2.4.0!

Jeroz — Last Friday at 4:08 PM
If you like the code, go update them nodes! :smiley:

[Dev-Happy] Blondfrogs — Last Friday at 4:08 PM
We recommend that you upgrade to it. It fixes a couple of bugs in the code base inherited from Bitcoin!
Milo — Last Friday at 4:08 PM
https://www.youtube.com/watch?v=t_g7NpFXm6g&feature=youtu.be sorry for the hold-up

LSJI07 - MBIT — Last Friday at 4:09 PM
Thanks, short and sweet!!

KAwAR — Last Friday at 4:10 PM
Is Coin Request live on the Android wallet?

Tron — Last Friday at 4:10 PM
Nice video. It isn't in the Play Store yet.

Pho3nix Monk3y — Last Friday at 4:10 PM
Well, this is the first time in a while where we have this many devs online. What questions do y'all have?

LSJI07 - MBIT — Last Friday at 4:11 PM
Algo questions?

Pho3nix Monk3y — Last Friday at 4:11 PM
Sure

KAwAR — Last Friday at 4:11 PM
KK

LSJI07 - MBIT — Last Friday at 4:12 PM
What are the proposed 22 algos in x22r? I could only find the original 16 plus the 5 from x21.

Tron — Last Friday at 4:12 PM
Likely the 5 from x21, plus one more to find. We need to make sure they're all similar in time profile.

liqdmetal — Last Friday at 4:14 PM
Should we bother fixing an ASIC problem that we don't know exists for sure or not?

Tron — Last Friday at 4:14 PM
That's the 170 million dollar question.

[Dev-Happy] Blondfrogs — Last Friday at 4:14 PM
I would prefer to be proactive, not reactive, imo.

Jeroz — Last Friday at 4:14 PM
Same

LSJI07 - MBIT — Last Friday at 4:15 PM
RIPEMD160 is a golden oldie, but I'm not sure on hash speed compared to the others.

liqdmetal — Last Friday at 4:15 PM
In my mind we should focus on the restricted messaging etc.

Sevvy (y rvn pmp?) — Last Friday at 4:15 PM
We probably won't know if the action was needed until after we take the action.

liqdmetal — Last Friday at 4:15 PM
We are at risk of being interventionistas acting under opacity.

Tron — Last Friday at 4:15 PM
It needs to spit out at least 256 bits. Preferably 512 bits.

LSJI07 - MBIT — Last Friday at 4:15 PM
OK

Tron — Last Friday at 4:15 PM
If it isn't 512 bits, it'll cause some extra headache for the GPU mining software.
liqdmetal — Last Friday at 4:16 PM
I seek to avoid iatrogenics.

Tron — Last Friday at 4:16 PM
Similar to the early problems when all the algos except the first one were built for 64-byte (512-bit) inputs. Had to look that one up. TIL iatrogenics.

Jeroz — Last Friday at 4:17 PM
I have to google most of @liqdmetal's vocabulary :smile:

liqdmetal — Last Friday at 4:17 PM
@Tron tl;dr: basically the unseen, unintended negative side effects of the ASIC "cure".

Sevvy (y rvn pmp?) — Last Friday at 4:18 PM
10 dolla word

liqdmetal — Last Friday at 4:19 PM
We need a really strong case to intervene in what has been created.

Tron — Last Friday at 4:19 PM
I agree. I'm less concerned with the technical risk than I am with the potential split risk experienced multiple times by Monero.

Sevvy (y rvn pmp?) — Last Friday at 4:20 PM
Tron, do you agree that forking the Ravencoin chain presents unique risks compared to other chains that aren't hosting assets?

Jeroz — Last Friday at 4:21 PM
Yes, if you fork, you need to figure out, for each asset, which chain you want to support.

Sevvy (y rvn pmp?) — Last Friday at 4:21 PM
Yeah, and the asset issuer could have a chain preference.

Tron — Last Friday at 4:22 PM
@Sevvy (y rvn pmp?) Sure. Although I'd expect that the asset issuers will honor the assets on the dominant chain. The bigger concern is the branding confusion of multiple forks. See Bitcoin, Bitcoin Cash, and Bitcoin SV for an example. We know they're different, but do non-crypto folks?

Hans_Schmidt — Last Friday at 4:22 PM
I thought that the take-away from the recently published analyses and discussions was that ASICs for RVN may be active, but if so, they are not much more effective than GPUs.

Sevvy (y rvn pmp?) — Last Friday at 4:22 PM
Agreed on all accounts there, Tron.

Tron — Last Friday at 4:23 PM
I'm not yet convinced ASICs are on the network.

KAwAR — Last Friday at 4:23 PM
It would be better to damage an ASIC builder by forking after they have incurred major expenses.
Creating for them the type of deficit that could be negated by just buying instead of mining. ASIC existence should be 100 percent confirmed before a fork.

liqdmetal — Last Friday at 4:23 PM
170 million dollar question is right, lol.

Tron — Last Friday at 4:24 PM
I've had someone offer to connect me to the folks at Fusion Silicon.

Sevvy (y rvn pmp?) — Last Friday at 4:25 PM
Yes, and if they are active on the network, they are not particularly good ASICs, which probably makes it a moot point.

Tron — Last Friday at 4:26 PM
The difficult part of this problem is that by the time everyone agrees that ASICs are problematic on the network, voting the option in is likely no longer an option.

Sevvy (y rvn pmp?) — Last Friday at 4:26 PM
Yes. Part of me wonders if we would say "okay, the clock on the ASIC countdown is reset by this new algo. But now the race is on."

[Dev-Happy] Blondfrogs — Last Friday at 4:26 PM
There are always risks when making a change that will fork the network. We don't want to wait too long though, as Tron said. It won't be a voting change; it will be a mandatory change at a block number.

Sevvy (y rvn pmp?) — Last Friday at 4:26 PM
Acknowledge the inevitable.

Milo — Last Friday at 4:27 PM
I had just a small question from my side. When do you think the Android version will be published, and do you maybe have a time-frame for the others?

Tron — Last Friday at 4:27 PM
Quick poll: how would everyone here feel about a BIP9 option - separate from the new features that can be voted in?

KAwAR — Last Friday at 4:27 PM
Maybe voting should not be a strictly blockchain vote. A republic and a democratic voice?

[Dev-Happy] Blondfrogs — Last Friday at 4:27 PM
@Milo We can try to get a beta out next week, and publish soon after that.

Milo — Last Friday at 4:28 PM
@[Dev-Happy] Blondfrogs :thumbsup: :slight_smile:

[Dev-Happy] Blondfrogs — Last Friday at 4:28 PM
BIP9 preemptive vote. I like it.
Tron (Last Friday at 4:30 PM): The advantage of a BIP9 vote is that it puts the miners and mining pools at a clear majority before activation.
LSJI07 - MBIT (Last Friday at 4:30 PM): Centralisation is inevitable unless we decide to resist it. ASICs are market based and know the risks and rewards possible. A key step in resisting is sending a message. An algo change to increase asic resistance is imho a strong message. A BIP9 vote now would also be an early indicator of bad actors....
Tron (Last Friday at 4:30 PM): The disadvantage is that it may not pass if the will isn't there.
LSJI07 - MBIT (Last Friday at 4:30 PM): Before assets are on main net and cause additional issues.
KAwAR (Last Friday at 4:31 PM): I am not schooled enough in coding to have an educated voice. I only understand social problems and how they affect the economy.
SpyderDev (Last Friday at 4:31 PM): All are equal on RVN
Tron (Last Friday at 4:31 PM): It is primarily a social problem. The tech change is less risky and easier than the social one.
LSJI07 - MBIT (Last Friday at 4:32 PM): All can have a share....people who want more of a share however pay for the privilege and associated risks.
KAwAR (Last Friday at 4:33 PM): Assets and exchange listings need to be consistent and secure.
brutoid (Last Friday at 4:36 PM): I'm still not entirely clear on what the overall goal of the algo change is. Is it just to brick the supposed ASICs (the unknown 45%), which could still be FPGAs as seen from the recent block analysis posted in the nest? Is the goal to never let ASICs on? Is it to ultimately brick FPGAs? Are we making Raven strictly GPU only? I'm still unclear
LSJI07 - MBIT (Last Friday at 4:37 PM): What about the future issue of ASICs returning after a BIP9 fork "soon"? Are we all following the WP as a community? i.e. asic resistant, or are we prepared to change that to asic resistant for early coin emission? Ideally we should plan for the future.
Could the community make a statement that no future algo changes will be required, to incentivise future public asic manufacturers? Lol. Same question @brutoid
brutoid (Last Friday at 4:37 PM): Haha it is. You mind-beamed me!
[Dev-Happy] Blondfrogs (Last Friday at 4:38 PM): That is up to the community. Currently, the feeling seems to be that the community is anti asic forever. The main issue is getting people to upgrade.
KAwAR (Last Friday at 4:38 PM): Clarity is important. Otherwise we are attacking windmills like Don Quixote.
brutoid (Last Friday at 4:39 PM): I'm not getting the feeling of community ASIC hate if the last few weeks of discussion are anything to go by?
Hans_Schmidt (Last Friday at 4:39 PM): A unilateral non-BIP9 change at a chosen block height is a serious thing, but anti-ASIC has been part of the RVN philosophy since the whitepaper and is therefore appropriate for that purpose.
[Dev-Happy] Blondfrogs (Last Friday at 4:39 PM): We can use the latest release as an example. It was a non-forking release, announced for 2 weeks, and only ~30% of the network has upgraded.
Tron (Last Friday at 4:39 PM): @Hans_Schmidt Well said.
liqdmetal (Last Friday at 4:40 PM): I'm not concerned about an "asic hardware problem" so much as I believe it more likely that what we are seeing is several big fish miners (perhaps a single really big fish). For now I recommend standing pat on x16r. In the future I can see an algo upgrade fork to keep the algo up to date. If we start fighting against dedicated x16r hashing machines designed and built to secure our network, we are more likely to go down in flames. The custom SHA256 computers that make bitcoin the most secure network in existence are a big part of that security. If some party has made an asic that performs up to par or better than FPGA or GPU on x16r, that is a positive for this network, a step towards SHA256 security levels. It is too bad the community is in the dark regarding their developments.
Therefore I think the community has to clarify its stance towards algorithm changes. I prefer a policy that will encourage the development of mining software, bitstreams and hardware by as many parties as possible. The imminent threat of an algo fork screws up the incentive for developers.
Jeroz (Last Friday at 4:40 PM): @brutoid the vocal ones are lenient towards asics, but the outcome of the 600+ votes seemed pretty clear.
brutoid (Last Friday at 4:40 PM): This is my confusion
Tron (Last Friday at 4:41 PM): More hashes are only better if the cost goes up proportionally. Machines that do more hashes for less $ don't secure the network more, and trend towards centralization.
Jeroz (Last Friday at 4:41 PM): I would argue for polling every so often, as it will certainly evolve dynamically with the state of crypto over time.
Tron (Last Friday at 4:41 PM): Measure security in two dimensions. Distribution, and $/hash.
liqdmetal (Last Friday at 4:41 PM): and volume of hash
traysi (Last Friday at 4:42 PM): 45% of the hashrate going to one party is unhealthy, and standing pat on x16r just keeps that 45% where it is.
Tron (Last Friday at 4:42 PM): Volume doesn't matter if the cost goes down. For example, let's say software shows up that does 1000x better than the software from yesterday, and everyone moves to it. That does not add security. Even if the "difficulty" and embedded hashes took 1000x more attempts to find.
brutoid (Last Friday at 4:42 PM): My issue is definitely centralization of hash and not so much what machine is doing it. I mine with both GPU and FPGA. Of course, the FPGAs are not on raven
TJay (Last Friday at 4:44 PM): easy solution is just to replace a few of the 16 current hash functions, without messing with x21r or whatever new shit
Tron (Last Friday at 4:44 PM): How do folks here feel about allowing CPUs back in the game?
traysi (Last Friday at 4:44 PM): Botnets is my concern with CPUs
brutoid (Last Friday at 4:44 PM): Botnets is my concern
SpyderDev (Last Friday at 4:44 PM): Yes please.
LSJI07 - MBIT (Last Friday at 4:44 PM): the poll votes seem not very security conscious. More like day miners chasing profits. I love them, bless! Imho the future is bright for raven, however these issues, if not sorted out now, will bite hard long term when assets are on the chain and gpu miners are long gone.....
Zaab (Last Friday at 4:45 PM): How has the testing of restricted assets been on the test net?
liqdmetal (Last Friday at 4:45 PM): Agreed. I don't think x16r is obsolete like that yet however
[Dev-Happy] Blondfrogs (Last Friday at 4:45 PM): @Zaab not enough testing at the moment.
Hedger (Last Friday at 4:45 PM): Yes, how is the testing going?
justinjja (Last Friday at 4:45 PM): Like randomX, or how are cpus going to be back in the game?
Tron (Last Friday at 4:45 PM): @Zaab Just getting started at testing at the surface level (RPC calls), and fixing as we go.
Zaab (Last Friday at 4:45 PM): And/or any updates on the review of dividend code created by the community
Lokar -=Kai=- (Last Friday at 4:45 PM): if the amount of hash the unknown pool has is fixed as standarderror indicated, then waiting for the community of FPGAers to get onto raven might be advantageous if the fork doesn't hurt FPGAs.
Zaab (Last Friday at 4:45 PM): Can't remember who was on it
SpyderDev (Last Friday at 4:45 PM): @Zaab But we are working on it...
Lokar -=Kai=- (Last Friday at 4:46 PM): more hash for votes
Jeroz (Last Friday at 4:46 PM): @Maldon is, @Zaab
Tron (Last Friday at 4:46 PM): @Zaab There are unit tests and functional tests already, but we'd like more.
[Dev-Happy] Blondfrogs (Last Friday at 4:46 PM): @Zaab Dividend code is currently adding test cases for better security. Should have more updates on that next meeting
KAwAR (Last Friday at 4:46 PM): Absolute democracy seems to resemble anarchy, or at least civil war. In EVE online they have a type of community voice that gets voted in by the community.
Zaab (Last Friday at 4:46 PM): No worries, was just curious if it was going as planned or significant issues were being found. Obviously some hiccups are expected. More testing is always better!
Tron (Last Friday at 4:47 PM): Who in here is up for a good civil war? :wink:
Zaab (Last Friday at 4:47 PM): Tron v Bruce. Celebrity fight night with proceeds to go to the RVN dev fund
SpyderDev (Last Friday at 4:48 PM): Cagefight or mudpit?
Jeroz (Last Friday at 4:48 PM): talking about dev funds..... :wink:
Pho3nix Monk3y (Last Friday at 4:49 PM): and there goes the conversation....
KAwAR (Last Friday at 4:49 PM): I am trying to be serious...
Zaab (Last Friday at 4:49 PM): Sorry, back to the asic topic!
traysi (Last Friday at 4:49 PM): @Tron What do we need in order to make progress toward a decision on the algo? Is there a plan or a roadmap of sorts to get us some certainty about what we're going to do?
LSJI07 - MBIT (Last Friday at 4:50 PM): Could we have 3 BIP9 votes? No. 1: friendly to asics, retain status quo. No. 2: change to x17r, minimal changes etc, with no additional future PoW/algo upgrades. No. 3: full asic resistance, x22r, and see what happens... :thonk~1: Sounds messy....
Tron (Last Friday at 4:51 PM): Right now we're in research mode. We're building CNv4 so we can run some metrics. If that goes well, we can put together x22rc and see how it performs. It will likely gore everyone's ox. CPUs can play, GPUs work but aren't dominant, ASICs are VERY difficult, and FPGAs will have a tough time.
Zaab (Last Friday at 4:51 PM): Yeah, I feel like the results would be unreliable
Tron (Last Friday at 4:51 PM): Is this good, or do we lose everyone's vote?
PlayHard (Last Friday at 4:52 PM): Fpga will be dead
Lokar -=Kai=- (Last Friday at 4:52 PM): why isn't a simple XOR or something on the table?
Zaab (Last Friday at 4:52 PM): The multiple bip9, that is
Lokar -=Kai=- (Last Friday at 4:52 PM): something asic breaking but that doesn't greatly complicate ongoing efforts for FPGA, being my point.
justinjja (Last Friday at 4:52 PM): How are you going to vote for x22rc? Because if by hashrate, that wouldn't pass.
traysi (Last Friday at 4:52 PM): Personally I like the idea of x22rc, but I'd want to investigate the botnet threat if CPUs are allowed back in.
Tron (Last Friday at 4:52 PM): XOR is on the table, and was listed in my Medium post. But the social risk of chain split remains, for very little gain.
traysi (Last Friday at 4:53 PM): @Lokar -=Kai=- A small change means that whoever has 45% can probably quickly adapt.
LSJI07 - MBIT (Last Friday at 4:53 PM): Research sounds good. x22rc could be reduced to x22r for simplicity...
Tron (Last Friday at 4:53 PM): x22r is a viable option. No CNv4.
LSJI07 - MBIT (Last Friday at 4:53 PM): Don't know how much time we have to play with though...
Lokar -=Kai=- (Last Friday at 4:53 PM): if they have FPGAs, yes; if they have an ASIC, then not so much. but I guess that gets to the point: what exactly are we trying to remove from the network?
PlayHard (Last Friday at 4:54 PM): Guys, my name is Arsen and we designed an x16r fpga on bcus. Just about to release it to the public. I am buzzdave's partner. Cryptonight will kill us. But agreed, an asic is possible on x16r. And you don't need 256 cores.
traysi (Last Friday at 4:55 PM): Hi Arsen. Are you saying CN will kill "us" meaning RVN, or meaning FPGA?
Jeroz (Last Friday at 4:55 PM): This is what I'm afraid of ^ an algo change killing FPGA, as I have the feeling there is a big fpga community working on this
PlayHard (Last Friday at 4:55 PM): Fpgas ))
whitefire990 (Last Friday at 4:55 PM): I am also about to release X16R for CVP13 + BCU1525 FPGAs. I'm open to algo changes but I really don't believe in CPU mining because of botnets. Any CNv4 shifts 100% to CPU mining, even if it is only 1 of the 22 functions.
Lokar -=Kai=- (Last Friday at 4:55 PM): namely FPGAs that aren't memory equipped, like fast mem, not ddr
PlayHard (Last Friday at 4:55 PM): Hbm non hbm Cryptonight
whitefire990 (Last Friday at 4:56 PM): Right now, with both Buzzdave/Altered Silicon and myself (Zetheron) about to release X16R for FPGAs, the 45% miner's share will decrease to 39% or less.
PlayHard (Last Friday at 4:56 PM): Will be dead for fpga
LSJI07 - MBIT (Last Friday at 4:56 PM): sounds like x22r is fpga "friendly" ... more so than asic anyway...
PlayHard (Last Friday at 4:56 PM): But a change must be planned. There is no way x16r can avoid asics
TJay (Last Friday at 4:56 PM): @LSJI07 - MBIT I would say less friendly...
whitefire990 (Last Friday at 4:57 PM): As I mentioned in the #thenest discussion, asic resistance increases with the square of the number of functions, so X21R is more asic resistant than X16R, but both are pretty resistant
PlayHard (Last Friday at 4:58 PM): Yeah, more algos make it heavier on ASIC
DirkDiggler (Citadel Architect) (Last Friday at 4:58 PM): My interpretation of the whitepaper was that we used x16r as it was brand new (thus ASIC resistant), and that was to ensure a fair launch... We've launched... I don't like the idea of constantly forking to avoid the inevitable ASICs. x16r was a great "experiment" before we had any exchange listings... that ship has sailed though... not sure about all these x22rs lmnop changes
KAwAR (Last Friday at 5:00 PM): I believe that it is easier to change the direction of a bicycle than an oil tanker. We feel more like a train. We should lay out new tracks and test on them and find benefits that are acceptable to everyone except train robbers. Then open the new train station with no contentious feelings except a silently disgruntled minority group. ???
Hans_Schmidt (Last Friday at 5:01 PM): The most productive action the community can take now re ASICs is to voice support for the devs to make a non-BIP9 change at a chosen block height if/when the need is clear.
That removes the pressure to act rashly to avoid voting problems.
LSJI07 - MBIT (Last Friday at 5:01 PM): That's why I'm proposing to fork at least once to a more asic resistant algo (but FPGA "friendly"/possible), with the proviso, ideally, that no more PoW algo forks are required, to give future ASICs some opportunity to innovate with silicon and efficiency.
TJay (Last Friday at 5:01 PM): folks should take into account that high end FPGAs like the BCU1525 on x16r can't beat even previous-gen GPUs (Pascal) in terms of hash cost. so they aren't a threat to the miner community
PlayHard (Last Friday at 5:02 PM): A proper change requires proper research
eyz (Silence) (Last Friday at 5:02 PM): Just so I'm clear here, we are trying to boot ASICs, don't want CPUs because of botnets, and are GPU and FPGA friendly, right?
PlayHard (Last Friday at 5:02 PM): It is not a quick one-day process
eyz (Silence) (Last Friday at 5:02 PM): If there is a bip9 vote there needs to be a clear explanation, as I feel most in the community don't understand exactly what we are trying to fix
Tron (Last Friday at 5:03 PM): @Hans_Schmidt I like that route. It has some game theoretics. It gives time for miners to adapt. It is only used if needed. It reduces the likelihood of ASICs dominating the network, or even being built.
[Dev-Happy] Blondfrogs (Last Friday at 5:03 PM): Hey guys, great convo. We are of course looking to do the best thing for the community and miners. We are going to be signing off here though.
justinjja (Last Friday at 5:03 PM): TJay, that comes down to power cost. If you're paying 4c/kWh, gpus all the way. But if you're a home miner in Europe, an fpga is your only chance
LSJI07 - MBIT (Last Friday at 5:03 PM): @Hans_Schmidt How do we decide the block limit and when sufficient evidence is available? I would say we have had much compelling information to date...
[Dev-Happy] Blondfrogs (Last Friday at 5:03 PM): Thanks for participating, and keep up the good work :smiley: Have a good weekend.
CAWWWW
Tron (Last Friday at 5:03 PM): I haven't seen any compelling evidence of ASICs - yet.
Pho3nix Monk3y (Last Friday at 5:03 PM): :v:
Jeroz (Last Friday at 5:04 PM): I suggest to continue discussion in #development and #thenest :smiley: thanks all!
Tron (Last Friday at 5:04 PM): Cheers everyone!
KAwAR (Last Friday at 5:04 PM): Agree with Hans.
DirkDiggler (Citadel Architect) (Last Friday at 5:04 PM): thanks Tron
Pho3nix Monk3y (Last Friday at 5:04 PM): Ending here. continue in Nest if wanted
DirkDiggler (Citadel Architect) (Last Friday at 5:04 PM): I am waiting for compelling evidence myself.
Bitcoin mining uses as much electricity as a small country. Many people hate it for this reason; it's one of the more popular arguments against cryptocurrencies. Will crypto mining kill polar bears? I think not. I think it will help save polar bears. "Bear" with me. Germany produces a significant part of its electricity from renewable energy: wind and solar. As we all know, these sources are intermittent and seasonal, as is demand. When the share of renewable energy in the overall energy mix becomes large enough, the result is inevitable: temporary and seasonal overcapacity. This isn't just theoretical: energy prices in Germany and the UK were effectively negative last Christmas: http://www.businessinsider.com/renewable-power-germany-negative-electricity-cost-2017-12//?r=AU&IR=T As explained in the above article, this isn't a rare freak occurrence; it's expected, and it will have to become much more common if, as a society, we want to transition away from fossil fuels. Because to do that, we need (much) more renewable energy sources. A study I saw for Germany calculated they needed at least 89% more capacity, just to handle peak loads. But that also implies an incredible amount of overcapacity when demand isn't anywhere near peak, or when supply is above average due to favorable weather. Storing excess renewable electricity is, in most places, very expensive and inefficient. So much so that it's rarely even done. This is a major problem. Wind turbines are therefore feathered, solar panels turned off, and excess electricity dumped in giant electrical heaters, offered for free, or even offered at negative prices. Renewable energy may have become cheaper than other forms per kWh, but that's only true if you can sell all of your production. And it's only true if the consumption occurs near the renewable energy source, and not hundreds or thousands of kilometers away.
Building capacity that can only be used 50% or even 10% of the time, or building infrastructure to store surplus electricity, is still very expensive, as is transporting renewable energy over long distances. I know what you're thinking: mining won't help here, because mining intermittently is something that seems crazy today; miners keep their expensive machines on 24/7. But that's only because today, the overall cost structure of a (bitcoin) miner is heavily tilted towards hardware depreciation, particularly for anyone paying retail prices for mining ASICs. This will change completely, for two related reasons: 1) Mining efficiency improvements will taper off. Mining ASICs have progressed extremely rapidly, from being based on CPUs and FPGAs, to using 20-year-old, obsolete 180nm process technology in the first ASICs, to state-of-the-art 16nm chips today. This has resulted in at least a million-fold improvement in efficiency in just a few years, which in turn led to hardware investments that needed to be recovered in a few months or even weeks (!) before they were obsolete. Opportunity cost has been so high that miners have literally chartered 747s to transport new mining equipment from the manufacturer in China to their datacenters in the US. This can't and won't last. 12nm and 7nm ASICs are about to be produced, or are being produced now. It doesn't get better than that today, and it won't for many years to come. Moore's law is often cited to show efficiency will keep going up. That may be true, but until now the giant leaps we have seen had nothing to do with Moore's law, which "only" predicts a doubling every 18 months. Moore's law is also hitting a brick wall (you can't scale transistors smaller than atoms), and only states that transistor density increases.
Not that chips become more efficient or faster, which increasingly is no longer happening (new CPUs are getting more cores, but run at comparable speeds and comparable power consumption to previous generations). What all this means is that these upcoming state-of-the-art mining ASICs will remain competitive for many years, at least 3 and possibly more than 5, and thus can be used and written off over that many years. But they will still consume electricity during all those years, shifting the overall costs from hardware to electricity. 2) Mining is still too profitable (for anyone making their own ASICs), and mining hardware is therefore still too expensive (for everyone else). Miner hardware production simply hasn't yet been able to keep up with demand and soaring bitcoin prices. This leads to artificially low mining difficulty, making mining operationally profitable even with expensive electricity, and it also leads to exuberant hardware profit margins. You can see this easily: just look at the difficulty of bitcoin. When the price dropped by 70%, did you see a corresponding drop in difficulty? No, no drop at all; it just keeps growing exponentially. That only makes sense because we are not yet near saturation, or near marginal electricity costs for Bitmain & Co. It's not worth it yet for them to turn off their miners. It's not even worth it yet for residential miners. Another piece of evidence for this is Bitmain's estimated $4 billion profit. But mining is a zero-sum game; over time, market forces will drive hardware prices and the mining itself to become only marginally profitable. We're clearly not close to that - yet. You might think so as a private miner, but that's only because you overpaid for your hardware. Let's look at today's situation to get an idea. An Antminer S9 retails for $2300 and uses ~1300W at the wall. If you write off the hardware over a year, electricity and hardware costs balance out at an electricity price of $0.2/kWh.
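That break-even figure is easy to verify; here is a minimal sketch of the arithmetic, using only the numbers quoted above (one-year write-off and 24/7 operation assumed):

```python
# Back-of-the-envelope check of the Antminer S9 example above:
# $2300 of hardware written off over one year vs. 1300 W running 24/7.
HARDWARE_COST_USD = 2300.0
POWER_KW = 1.3
HOURS_PER_YEAR = 365 * 24  # 8760

kwh_per_year = POWER_KW * HOURS_PER_YEAR  # ~11,388 kWh/year

# Electricity price at which the yearly power bill equals
# the yearly hardware write-off:
breakeven_usd_per_kwh = HARDWARE_COST_USD / kwh_per_year

print(f"{kwh_per_year:.0f} kWh/year")
print(f"break-even at ${breakeven_usd_per_kwh:.3f}/kWh")  # ~$0.20/kWh
```

At $2300 over ~11,400 kWh the break-even lands at roughly $0.20/kWh, matching the figure in the text.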
Anything below that, and hardware becomes the major cost. But how will that evolve? As difficulty keeps going up, bitcoin mining revenue per ASIC will decline proportionally, until demand for mining ASICs eventually tapers off. To counter that, prices of ASICs will be lowered until they approach marginal production costs, which by my estimate is closer to $200 than $2000. Let's say a 1300W S9 equivalent at that point gets sold at $400, leaving Bitmain a healthy profit margin; that would mean each year a miner would spend 5x more on electricity than on hardware. Hardware will remain competitive for more than a single year though. Say you write it off over 3 years; now you're spending 15x more on electricity than on hardware. Intermittent mining, like 50% of the time but with free or virtually free electricity, will become economical long before that. By now, I will hopefully have convinced you of the viability of mining with intermittent excess renewable energy; intermittent mining with renewable energy will not only become viable, it will become the only way to do it profitably. Renewable energy at the source is already cheaper than any carbon-burning source. Even in Qatar, they install solar plants because it's cheaper than burning their own gas. It's transporting and storing the electricity that usually is the problem. Gas can easily be transported and stored. Wind and solar energy cannot. And that's a massive problem for the industry. But mining doesn't need either. You can mine pretty much anywhere and anytime. All you need besides electricity is a few containers and an internet connection for a solar plant or wind farm to monetize excess energy. Moreover, mining is a zero-sum game, a race to the bottom. As long as it's profitable for green energy providers to deploy more hardware (which will be true as long as they can at least recover their hardware investment), difficulty will go up, until it becomes unprofitable for anyone who has to pay for their electricity.
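The 5x and 15x ratios above follow from the same arithmetic. A quick sketch, where the $400 hardware price is the paragraph's hypothetical and the $0.20/kWh electricity rate is an assumption carried over from the earlier S9 example (the exact ratios come out slightly above the rounded 5x/15x in the text):

```python
# Hypothetical future S9-class miner from the paragraph above:
# $400 hardware (assumed), 1300 W, electricity assumed at $0.20/kWh.
HARDWARE_COST_USD = 400.0
POWER_KW = 1.3
USD_PER_KWH = 0.20
HOURS_PER_YEAR = 365 * 24

electricity_per_year = POWER_KW * HOURS_PER_YEAR * USD_PER_KWH  # ~$2278

# Yearly electricity spend relative to the yearly hardware write-off:
ratio_1yr = electricity_per_year / HARDWARE_COST_USD           # ~5.7x
ratio_3yr = electricity_per_year / (HARDWARE_COST_USD / 3.0)   # ~17x

print(f"1-year write-off: electricity ~{ratio_1yr:.1f}x hardware cost")
print(f"3-year write-off: electricity ~{ratio_3yr:.1f}x hardware cost")
```

The point survives the rounding: once hardware is cheap and long-lived, electricity dominates the cost structure by an order of magnitude.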
No one gives oil, coal or gas away for free, so anyone depending on those sources of electricity cannot remain competitive. If the bitcoin price were to go up so much that there isn't enough renewable electricity production in the world to accommodate the hashrate, bitcoin miners will simply install more solar and wind farms. Not because of their ecological awareness, but because it makes the most financial sense. And during peak demand periods, why wouldn't they turn off the miners and sell their electricity to the grid for a premium? Basically, crypto mining would fund renewable energy development and solve the exact problem laid out in the article linked above: provide overcapacity of renewable energy to handle grid peak loads, without needing any government funding or taxation of carbon-based sources, and without needing expensive and very inefficient energy storage. From the perspective of a green energy producer, energy storage, like a battery or hydrogen production, is just an expensive intermediate step between producing electricity and getting paid for that electricity. Crypto mining will do the same thing, converting excess electricity into cash, only much more efficiently. TL;DR: deploying more renewable electricity overcapacity is both very expensive and very necessary if we want to save polar bears. Financing for these large-scale green energy projects will either have to come from taxpayer money to store or subsidise the largely unused excess electricity, or it will come from crypto mining. Market forces will drive crypto mining to use the cheapest energy. Renewable energy already is cheaper per kWh than carbon-based power, and nothing is cheaper than excess, and thus free (or negative value), renewable energy. Bitcoin mining's carbon footprint will therefore become ~zero.
If you take into account the effect of financing and subsidizing large-scale renewable energy development that can also be used to supply the grid during peak demand periods, its carbon footprint will be hugely negative. BTW, if you wonder what Blockchains LLC is going to do with 61K acres near Tesla's factory: my guess is solar plants and crypto mining. Expect to see renewable energy development and crypto mining merge into one single industry. Check out Envion to get a glimpse of this future. I'm not endorsing their token as an investment, I haven't researched it at all, but the market they are going after is a very real one, and it's about to explode.
Crypto and the Latency Arms Race: Crypto Exchanges and the HFT Crowd
News by Coindesk: Max Boonen Carrying on from an earlier post about the evolution of high frequency trading (HFT), how it can harm markets and how crypto exchanges are responding, here we focus on the potential longer-term impact on the crypto ecosystem. First, though, we need to focus on the state of HFT in a broader context.
Conventional markets are adopting anti-latency arbitrage mechanisms
In conventional markets, latency arbitrage has increased toxicity on lit venues and pushed trading volumes over-the-counter or into dark pools. In Europe, dark liquidity has increased in spite of efforts by regulators to clamp down on it. In some markets, regulation has actually contributed to this. Per the SEC:
“Using the Nasdaq market as a proxy, [Regulation] NMS did not seem to succeed in its mission to increase the display of limit orders in the marketplace. We have seen an increase in dark liquidity, smaller trade sizes, similar trading volumes, and a larger number of “small” venues.”
Why is non-lit execution remaining or becoming more successful in spite of its lower transparency? In its 2014 paper, BlackRock came out in favour of dark pools in the context of best execution requirements. It also lamented message congestion and cautioned against increasing tick sizes, features that advantage latency arbitrageurs. (This echoes the comment to CoinDesk of David Weisberger, CEO of Coinroutes, who explained that the tick sizes typical of the crypto market are small and therefore do not put slower traders at much of a disadvantage.) Major venues now recognize that the speed race threatens their business model in some markets, as it pushes those “slow” market makers with risk-absorbing capacity to provide liquidity to the likes of BlackRock off-exchange. Eurex has responded by implementing anti-latency arbitrage (ALA) mechanisms in options: “Right now, a lot of liquidity providers need to invest more into technology in order to protect themselves against other, very fast liquidity providers, than they can invest in their pricing for the end client. The end result of this is a certain imbalance, where we have a few very sophisticated liquidity providers that are very active in the order book and then a lot of liquidity providers that have the ability to provide prices to end clients, but are tending to do so more away from the order book”, commented Jonas Ullmann, Eurex’s head of market functionality. Such views are increasingly supported by academic research. XTX identifies two categories of ALA mechanisms: policy-based and technology-based. Policy-based ALA refers to a venue simply deciding that latency arbitrageurs are not allowed to trade on it. Alternative venues to exchanges (going under various acronyms such as ECN, ATS or MTF) can allow traders to either take or make, but not engage in both activities. 
Others can purposefully select — and advertise — their mix of market participants, or allow users to trade in separate “rooms” where undesired firms are excluded. The rise of “alternative microstructures” is mostly evidenced in crypto by the surge in electronic OTC trading, where traders can receive better prices than on exchange. Technology-based ALA encompasses delays, random or deterministic, added to an exchange’s matching engine to reduce the viability of latency arbitrage strategies. The classic example is a speed bump where new orders are delayed by a few milliseconds, but the cancellation of existing orders is not. This lets market makers place fresh quotes at the new prevailing market price without being run over by latency arbitrageurs. As a practical example, the London Metal Exchange recently announced an eight-millisecond speed bump on some contracts that are prime candidates for latency arbitrageurs due to their similarity to products trading on the much bigger CME in Chicago. Why 8 milliseconds? First, microwave transmission between Chicago and the US East Coast is 3 milliseconds faster than fibre optic lines. From there, the $250,000 a month Hibernia Express transatlantic cable helps you get to London another 4 milliseconds faster than cheaper alternatives. Add a millisecond for internal latencies such as not using FPGAs and 8 milliseconds is the difference for a liquidity provider between investing tens of millions in speed technology or being priced out of the market by latency arbitrage. With this in mind, let’s consider what the future holds for crypto.
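The asymmetry of such a speed bump is the whole trick: new orders wait out the delay while cancellations apply at once, so a market maker who reacts within the window can pull a stale quote before the latency arbitrageur's order lands. A toy model of that mechanism (the 8 ms figure comes from the LME example above; everything else, including the class and method names, is an illustrative sketch and not any real venue's matching logic):

```python
SPEED_BUMP_MS = 8  # delay applied to incoming orders, not to cancellations

class BumpedVenue:
    """Toy venue: one resting maker quote at a now-stale price.
    Taker orders are delayed by the speed bump; cancels are immediate."""

    def __init__(self):
        self.quote_live = True
        self.pending = []  # (effective_arrival_ms, taker_id)
        self.fills = []

    def taker_order(self, t_ms, taker_id):
        # The order only reaches the book SPEED_BUMP_MS later.
        self.pending.append((t_ms + SPEED_BUMP_MS, taker_id))

    def maker_cancel(self, t_ms):
        self._advance(t_ms)
        self.quote_live = False  # takes effect immediately

    def run_to(self, t_ms):
        self._advance(t_ms)

    def _advance(self, t_ms):
        # Process delayed orders whose effective arrival time has passed.
        for arrival, taker_id in sorted(self.pending):
            if arrival <= t_ms and self.quote_live:
                self.fills.append(taker_id)
                self.quote_live = False
        self.pending = [p for p in sorted(self.pending) if p[0] > t_ms]

# Price moves on the bigger market at t=0. A latency arbitrageur hits
# the stale quote at t=1 ms (effective at t=9 ms); the slower maker
# reacts and cancels at t=5 ms.
venue = BumpedVenue()
venue.taker_order(1, "arbitrageur")
venue.maker_cancel(5)
venue.run_to(20)
print(venue.fills)  # [] -> the stale quote was pulled in time
```

With `SPEED_BUMP_MS = 0` the arbitrageur's order would arrive at t=1 ms and fill before the cancel at t=5 ms, which is exactly the race the bump is designed to defuse.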
Crypto exchanges must not forget their retail roots
We learn from conventional markets that liquidity benefits from a diverse base of market makers with risk-absorption capacity. Some have claimed that the spread compression witnessed in the bitcoin market since 2017 is due to electronification. Instead, I posit that it is greater risk-absorbing capacity and capital allocation that has improved the liquidity of the bitcoin market, not an increase in speed; in fact, being a fast exchange with colocation, such as Gemini, has not translated into higher volumes. Old-timers will remember Coinsetter, a company that, per the Bitcoin Wiki, “was created in 2012, and operates a bitcoin exchange and ECN. Coinsetter’s CSX trading technology enables millisecond trade execution times and offers one of the fastest API data streams in the industry.” The Wiki page should use the past tense, as Coinsetter failed to gain traction, was acquired in 2016 and subsequently closed. Exchanges that invest in scalability and user experience will thrive (BitMEX comes to mind). Crypto exchanges that favour the fastest traders (by reducing jitter, etc.) will find that winner-takes-all latency strategies do not improve liquidity. Furthermore, they risk antagonising the majority of their users, who are naturally suspicious of platforms that sell preferential treatment. It is baffling that the head of Russia for Huobi vaunted to CoinDesk that: “The option [of co-location] allows [selected clients] to make trades 70 to 100 times faster than other users”. The article notes that Huobi doesn’t charge — but of course, not everyone can sign up. Contrast this with one of the most successful exchanges today: Binance. It actively discourages some HFT strategies by tracking metrics such as order-to-trade ratios and temporarily blocking users that breach certain limits. Market experts know that Binance remains extremely relevant to price discovery, irrespective of its focus on a less professional user base. Other exchanges, take heed.
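The order-to-trade-ratio policing that the paragraph attributes to Binance can be sketched in a few lines. Everything here is hypothetical: the thresholds, the class, and the method names are made up for illustration; Binance does not publish its actual limits in this form. The idea is simply that a user who sends floods of orders that almost never result in trades gets temporarily blocked:

```python
from collections import defaultdict

# Illustrative thresholds, NOT Binance's real limits.
MAX_ORDER_TO_TRADE_RATIO = 100
MIN_ORDERS_BEFORE_CHECK = 1000  # don't judge users on tiny samples

class OtrTracker:
    """Per-user order-to-trade ratio tracking with temporary blocking."""

    def __init__(self):
        self.orders = defaultdict(int)
        self.trades = defaultdict(int)
        self.blocked = set()

    def on_order(self, user):
        self.orders[user] += 1
        self._check(user)

    def on_trade(self, user):
        self.trades[user] += 1

    def is_blocked(self, user):
        return user in self.blocked

    def _check(self, user):
        if self.orders[user] < MIN_ORDERS_BEFORE_CHECK:
            return
        # Many orders with almost no resulting trades is the classic
        # signature of quote-stuffing / latency-sensitive strategies.
        ratio = self.orders[user] / max(self.trades[user], 1)
        if ratio > MAX_ORDER_TO_TRADE_RATIO:
            self.blocked.add(user)

tracker = OtrTracker()
for _ in range(5):           # 5 trades...
    tracker.on_trade("hft")
for _ in range(2000):        # ...against 2000 orders
    tracker.on_order("hft")
print(tracker.is_blocked("hft"))  # True
```

A real venue would add a rolling time window and an unblock timer, but even this crude counter captures why the metric discourages latency races without inconveniencing ordinary users, whose order counts never reach the check threshold.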
Coinbase closed its entire Chicago office where 30 engineers had worked on a faster matching engine, an exercise that is rumoured to have cost $50mm. After much internal debate, I bet that the company finally realised that it wouldn’t recoup its investment and that its value derived from having onboarded 20 million users, not from upgrading systems that are already fast and reliable by the standards of crypto. It is also unsurprising that Kraken’s Steve Hunt, a veteran of low-latency torchbearer Jump Trading, commented to CoinDesk that: “We want all customers regardless of size or scale to have equal access to our marketplace”. Experience speaks. In a recent article on CoinDesk, Matt Trudeau of ErisX points to the lower reliability of cloud-based services compared to dedicated, co-located and cross-connected gateways. That much is true. Web-based technology puts the emphasis on serving the greatest number of users concurrently, not on serving a subset of users deterministically and at the lowest latency possible. That is the point. Crypto might be the only asset class that is accessible directly to end users with a low number of intermediaries, precisely because of the crypto ethos and how the industry evolved. It is cheaper to buy $500 of bitcoin than it is to buy $500 of Microsoft shares. Trudeau further remarks that official, paid-for co-location is better than what he pejoratively calls “unsanctioned colocation,” the fact that crypto traders can place their servers in the same cloud providers as the exchanges. The fairness argument is dubious: anyone with $50 can set up an Amazon AWS account and run next to the major crypto exchanges, whereas cheap co-location starts at $1,000 a month in the real world. No wonder “speed technology revenues” are estimated at $1 billion for the major U.S. equity exchanges. 
For a crypto exchange, to reside in a financial, non-cloud data centre with state-of-the-art network latencies might ironically impair the likelihood of success. The risk is that such an exchange becomes dominated on the taker side by the handful of players that already own or pay for the fastest communication routes between major financial data centres such as Equinix and the CME in Chicago, where bitcoin futures are traded. This might reduce liquidity on the exchange because a significant proportion of the crypto market’s risk-absorption capacity is coming from crypto-centric funds that do not have the scale to operate low-latency strategies, but might make up the bulk of the liquidity on, say, Binance. Such mom-and-pop liquidity providers might therefore shun an exchange that caters to larger players as a priority.
Exchanges risk losing market share to OTC liquidity providers
While voice trading in crypto has run its course, a major contribution to the market’s increase in liquidity circa 2017–2018 was the risk appetite of the original OTC voice desks such as Cumberland Mining and Circle. Automation really shines in bringing together risk-absorbing capacity tailored to each client (which is impossible on anonymous exchanges) with seamless electronic execution. In contrast, latency-sensitive venues can see liquidity evaporate in periods of stress, as happened to a well-known and otherwise successful exchange on 26 June which saw its bitcoin order book become $1,000 wide for an extended period of time as liquidity providers turned their systems off. The problem is compounded by the general unavailability of credit on cash exchanges, an issue that the OTC market’s settlement model avoids. As the crypto market matures, the business model of today’s major cash exchanges will come under pressure. In the past decade, the FX market has shown that retail traders benefit from better liquidity when they trade through different channels than institutional speculators. Systematic internalizers demonstrate the same in equities. This fact of life will apply to crypto. Exchanges have to pick a side: either cater to retail (or retail-driven intermediaries) or court HFTs. Now that an aggregator like Tagomi runs transaction cost analysis for their clients, it will become plainly obvious to investors with medium-term and long-term horizons (i.e. anyone not looking at the next 2 seconds) that their price impact on exchange is worse than against electronic OTC liquidity providers. Today, exchange fee structures are awkward because they must charge small users a lot to make up for crypto’s exceptionally high compliance and onboarding costs. Onboarding a single, small value user simply does not make sense unless fees are quite elevated. Exchanges end up over-charging large volume traders such as B2C2’s clients, another incentive to switch to OTC execution. 
In the alternative, what if crypto exchanges focus on HFT traders? In my opinion, the CME is a much better venue for institutional takers as fees are much lower and conventional trading firms will already be connected to it. My hypothesis is that most exchanges will not be able to compete with the CME for fast traders (after all, the CBOE itself gave up), and must cater to their retail user base instead. In a future post, we will explore other microstructures beyond all-to-all exchanges and bilateral OTC trading.
A lot has happened in the past few weeks, and this time I actually paid enough attention to write it down.

Some general stats (and changes since last time):

Mining difficulty: 818,109,875 (0.00%) (next: ~800,107,457) (-2.6%)
Estimated hashrate: 2.61 Th/s (-37.41%)
Current average reward time: 21.88 minutes (+59.35%)
Tokens minted: 3,327,300 0xBTC (+1.41%)
Token holders: 4556 holders (+1.37%)
Total contract operations: 188399 txs (+0.27%)

Source: https://0x1d00ffff.github.io/0xBTC-Stats/?page=stats

Tokens required to be a top holder (and changes since last time):

Top 10: 36197.32435793 0xBTC (0.00%)
Top 25: 23614.66689656 0xBTC (+4.04%)
Top 50: 14174 0xBTC (0.00%)
Top 100: 7159.1234115 0xBTC (+4.35%)
Top 200: 2994.7652797 0xBTC (+1.49%)
Top 300: 1550 0xBTC (+0.45%)
Top 500: 650.02267112 0xBTC (+7.08%)
Top 1000: 165.9 0xBTC (+4.86%)

Source: https://etherscan.io/token/0xb6ed7644c69416d67b522e20bc294a9a9b405b31#balance

Recent events:
The biggest happening of the past two weeks is without a doubt Infernal_toast going public in an interview he gave to Ethex. Not irrelevant is the date on which he did so - the 10th anniversary of the publication of the Bitcoin whitepaper. In addition to the holy image of our Royal Toastiness, there's also some interesting crypto stuff in the video. https://youtu.be/fKMDSc7-AA4
Infernal_toast started a series called "Tokens with Toast", where he looks at the contracts of various ERC20 tokens and comments on them. The second video in the series looks at the contract of Oyster (PRL), which recently screwed over everyone that invested in it. The deployer had not locked himself out of the contract, so he could re-open the ICO and buy freshly-minted PRL tokens at ICO prices, which he promptly proceeded to dump on everyone. Highly educational stuff, recommended watching for everyone. https://youtu.be/iOTI5oslIbU
Infernal_toast is the gift that keeps on giving, and in addition to the previous two things, he has also created a video preview of the LavaWallet. In case you're not in the know, LavaWallet will allow people to transfer ERC20 tokens without having to hold ETH, paying for the tx with ERC20 tokens instead, greatly simplifying transfers on the Ethereum network. https://youtu.be/qZwKrAhs8Xc
The FPGA miners have switched their rigs to greener pastures and 0xBTC's hashrate has gone down considerably. 0x1d00ffff's website tracks the hashrate average between adjustments, and since we're currently lining up for an adjustment, the data displayed above is still somewhat skewed by the work that the FPGAs submitted a few hundred blocks ago. The charts he provides on his site are far more accurate if you're interested in the current situation (https://0x1d00ffff.github.io/0xBTC-Stats/?page=graphs). The actual hashrate is more in the region of 400 Gh/s and the blocktime is upwards of 100 minutes. As such, new supply is rapidly drying up and we're probably in for a grind towards a new adjustment, just like during the summer.
An idea was formed in the discord to do some paid marketing in the form of an advertorial on CCN. The 9 ETH required were raised in only a few days, with half of the amount donated by a generous whale - shout out to that homie. A community member going by Moonboy3000 (mirin' the name), who's a professional writer, volunteered to write the article and it'll hopefully be published early next week. The plan is to coincide the release of the article with press releases to various other news outlets as well as a refreshed version of the 0xBitcoin homepage that GeoffedUP has been working on. You can browse through the discord for the article and see the work-in-progress on the website at https://geoffedup.github.io/0xbitcoin.github.io/.
Mr F wrote a letter to a journalist who's been writing articles about wBTC and the future of ETH-miners after Casper. Given his choice of topics, he should be interested in 0xBitcoin, so we might get some free exposure. No guarantees though, we'll just have to wait and see.
Nic has been running 0xBitcoin ads on Youtube. Specifically, he's been running the "History of Cryptocurrency" video that toast made (https://youtu.be/Xf8W-C9fN5M). What's remarkable is that the video is over 10 minutes in length and over 20% of the people that had it displayed to them watched it to the end. That's probably on account of the fine targeting nic has done on the ads.
I am not a wealthy person by any means, but Bitcoin has helped. I discovered Bitcoin via a post on overclock.net on April 27th, 2011. I believe the price was about $1.50/coin then. I read the posts about people mining them, did some research, and immediately started my Radeon card mining them. I had a 4770 back then. There was an exchange to sell Bitcoins for linden dollars (Second Life currency) and then I could sell those for paypal dollars. Within a day I had proven to my wife that I could make money with this Bitcoin thing. Despite us being in a position where we couldn't even pay our credit cards, I took the $1100 we had and bought 4 5850's, some power supplies, and some cheap craigslist computers. I figured that if this whole Bitcoin thing failed miserably, at least I had some decent computer hardware I could resell and recover most of the cost. I immediately sold one 5850 for greater-than-market value since they were in demand and I needed the money, and started the other 3 mining. At one point, I was mining nearly 8 coins a day. I bought a few more cards as time went on and continued GPU mining for as long as it was viable. This whole thing saved us financially. I was able to sell the Bitcoins and settle on my unpayable credit card debts. I held on to a few during the crash but managed to sell most of them at $10 or more, fortunately. After that I started saving them, since they were worth so little. I bought some of the early BFL FPGA miners, the ones that were measured in MHashes not GHashes. After mining with those for a while and then selling them to someone who wanted them more than I did, I had more than 450 BTC. I took the plunge and pre-ordered BFL's latest offerings, the 60GH singles, the day they were available, becoming one of the first on the preorder list. Little did I know I would have been much better off just holding those coins... 
Regardless, I did eventually receive those singles, and managed to get about 225 BTC out of them before they were no longer worth running. I've been slowly selling the stash as we needed for remodel projects around the house and for miscellaneous expenses, though I finally no longer need to do so, as we've been able to pay off more debts and have more income than expenses each month. Now I've got a nice pile of savings, and I'm hoping to someday be able to use it to buy a better house in a better neighborhood. I generally don't tell people that I have just about all my liquid assets in Bitcoin, as they would call me crazy. They might be right. But it's a risk I'm willing to take. I do have some equity in my house, and some retirement accounts, but neither is worth more than my BTC stash. So that's MY story, what's yours?
Cloud Mining – Earn With Low Risk and Lower Costs
Bitcoin went through a bear market for more than a year and has finally seen a strong rally. Since April, the winning streak has pushed Bitcoin as high as $9,073 at one point, up 170.6% within the year and more than doubling the price. As the market gradually picks up, the number of contract-trading users is increasing; meanwhile, mining and related industries are slowly recovering, and the Antminer S17 series selling out quickly after going on sale is the best proof. 58COIN launched its BitHash services, which focus mainly on miner custody and cloud mining. It will soon launch a periodic cloud-mining service, starting from 1T and offering flexible period choices for various investors. Whether it is cloud mining or miner custody, neither can do without the mining machine. So what is mining? Can you make money mining? What are the determinants? Let's briefly analyze it. What is mining? What is a miner? Everyone knows that Bitcoin is a peer-to-peer payment system, and its core is trading. We need a ledger to keep track of accounts, just as a bank does the bookkeeping when we transfer money. In Bitcoin, the one who acts as the bookkeeper is called a miner. The bookkeeping method matters less than the specific bookkeeper – the miner. Since the Bitcoin system has no central node like a bank, everyone can compete for the position of miner and win the right to record transactions in the Bitcoin system. But if everyone competes, who should be granted the right? How can you prove that you did the work? How do we ensure that the miner does not record false accounts? Bitcoin's inventor, Satoshi Nakamoto, designed an ingenious method called the Proof-of-Work (PoW) system. The Bitcoin system lets everyone involved solve a math problem - calculating a hash value. The first to solve it is recognized by the whole network and gets the reward, and the speed of solving it depends on computing power. 
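As a rough illustration of the "math problem" described above, here is a minimal proof-of-work search sketched in Python. Bitcoin's actual rule compares the double-SHA-256 of an 80-byte block header against a target, so this is a simplification of the principle, not the real consensus rule:

```python
import hashlib

def proof_of_work(block_data: bytes, zero_bits: int) -> int:
    """Find a nonce whose SHA-256 hash of (block_data + nonce) falls
    below a target with roughly `zero_bits` leading zero bits."""
    target = 2 ** (256 - zero_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # this nonce "solves the math problem"
        nonce += 1

# The more zero bits required, the harder the problem: 16 bits takes
# ~65,536 hashes on average, which is why miners race on raw hashrate.
nonce = proof_of_work(b"example block", 16)
```

Raising `zero_bits` by one doubles the expected work, which is exactly the knob the network turns when it adjusts difficulty.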
In a word, mining is using a machine to take part in a math game: whoever calculates the answer first gets the bitcoin reward. The mining equipment is called a "miner". As the difficulty has kept rising, miners have been continuously upgraded, evolving from CPU to GPU to FPGA to ASIC and on to mining pools. "Who" determines the mining earnings? Several factors affect mining earnings. The first is the coin price: obviously, the higher the price, the more profitable mining is. The second is mining difficulty: if difficulty rises slowly, earnings are higher. The third is cost: low costs mean high profits, and cost here covers both the purchase price of miners and operating costs, including pool fees, labor, O&M and electricity. The last factor is computing power: the more hashrate you have over a given period, the more coins you mine. Mining is therefore much like trading coins, and the key to making money is the same: buy low and sell high! If your electricity is very cheap, you can buy a miner and mine; likewise, if you can buy low-cost hashrate, you can also mine. BitHash – The Optimal Choice for Conservative Investors. Seeing the market recover, many individual investors are eager to trade contracts, while new investors are preparing to enter mining. However, individual investors may face some obstacles: 1) you may not be able to watch market prices in real time; 2) you may not be able to find a suitable large-scale power supply; 3) you cannot guarantee 24-hour operation and maintenance of the miner. 
But these problems have been solved: the BitHash service launched by 58COIN provides everything needed to profit from mining. For example, the first batch of the hot-selling Antminer S17 and S17 Pro series, with high hashrate, low electricity costs and a PPS+ earnings-distribution model, sold for about 15,000 CNY (approx. $2,189.33) per miner, and users are not responsible for the operation and maintenance of the miner. Such a service is indeed profitable for investors, which is why the miner-custody service sold out as soon as it launched. 58COIN provides tailored services for diverse investors. If the one-time cost of miner custody is too high for you, you can choose cloud mining – a product that lets users lease hashrate per terahash or for a designated period and collect the corresponding earnings. Starting from 306 CNY/T (approx. $44.33/T) with no upper limit, large, medium and small investors alike can invest according to their financial plans. Due to the strong sales of buy-and-mine cloud mining, the platform added 1,000T of cloud mining yesterday to meet user demand. According to 58COIN, it will launch a periodic cloud-mining service in the near future. Compared with perpetual cloud mining, the new service offers more period options and a shorter static payback period. With low entry requirements and reasonable pricing, most investors have the opportunity to earn at lower cost. Regarding this, Steven, Operations Director of 58COIN, said: “Following the rebound in the Bitcoin price, the static payback period has compressed rapidly; cloud mining will be a good choice for conservative investors.” Website: https://www.58ex.com/ Facebook: https://www.facebook.com/coin.58COIN Twitter: https://twitter.com/58_coin Telegram: https://t.me/official58
Hello folks! I have been doing a lot of reading about the SmartCash cryptocurrency recently. SmartCash claims to be a private cryptocurrency that also focuses on a community-centered model. However, a lot of what I've found has concerned me. But first - I'd like to invite anyone with an opposing point of view to share their opinions after reading this. I'm not in this to spread baseless accusations, I just want an educated conversation. Please do not downvote simply because you disagree; instead, read my post, make a comment and discuss the topic with me. I've sent PMs to several people who support SmartCash in order to let them make their opinions clear. That said, let's go through this point by point - we'll start with the distribution. From the official SmartCash website:
Traditional cryptocurrencies, like Bitcoin, reward only the miners while neglecting the other actors that play an active role in maintaining, developing and promoting the project. SmartCash is a community-centric cryptocurrency, where community and development comes first. 80% of the block reward has been allocated to fund SmartHive community proposals as well as the Hive Teams. 20% of the remaining block reward has been allocated to Mining (5%) and SmartRewards (15%).
In the name of staying unbiased, I am going to acknowledge my ideological beliefs about mining, as well as my own personal biases as a miner: that miners should receive 100% of the rewards for the work they did. With this out of the way, let's discuss the mathematics of SmartCash's block reward distribution. 80% goes to community projects, 15% goes to SmartRewards (a staking equivalent, but not used for consensus), and 5% goes to the miners. In theory, this will lead to 95% fewer miners than normal, ensuring miners get paid roughly the same. In practice, this won't necessarily be true. But the existence of fewer miners also presents many issues. There have been several 51% attacks against cryptocurrencies that give all block rewards to miners - Krypton in 2016, Feathercoin in 2013, and Dashcoin (a cryptonote fork, DSH, not DASH) by MinerGate in April of 2017. Chain consensus with SmartCash is determined entirely by proof of work, not proof of stake; therefore you do not need to own any coins in order to attack the network and achieve 51% of the hashrate. In the case of a cryptocurrency that gives miners 5% of the block rewards, achieving 51% of the nethash is quite easy, because fewer people will be mining. SmartCash's current network hashrate is 10 Th/s, or 10 trillion hashes per second; a conservative estimate for a GTX 1080's hashrate is 1 Gh/s, or 1 billion hashes per second. Therefore, the network is currently secured by the equivalent of 10,000 GTX 1080 GPUs. Given that this GPU costs approximately $500, it would take about $5 million in hardware to conduct a 51% attack on the network. At NiceHash prices of ~0.3 BTC/TH/s/day, this would cost ~$60,000 per day, taking into account a necessary raise in the offered price to 'persuade' more people to switch to the Keccak algo, as only 2 TH/s is currently for sale on NiceHash. Even worse, Keccak (SmartCash's hashing algorithm) was specifically designed to be ASIC-friendly. From the official Keccak website:
Keccak, the winner of the SHA-3 competition, is blazing fast when implemented on dedicated (ASIC) or programmable (FPGA) hardware.
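As a rough check, the GPU-based 51% attack-cost arithmetic above can be reproduced in a few lines of Python. All figures are the post's own estimates, not independently verified:

```python
# Back-of-the-envelope cost of matching SmartCash's network hashrate
# with commodity GPUs, using the figures quoted above.
network_hashrate = 10e12   # 10 Th/s network hashrate
gpu_hashrate = 1e9         # ~1 Gh/s per GTX 1080 (conservative estimate)
gpu_price = 500            # USD per GPU

gpus_needed = network_hashrate / gpu_hashrate
hardware_cost = gpus_needed * gpu_price
print(int(gpus_needed))    # 10000 GPUs
print(int(hardware_cost))  # 5000000, i.e. ~$5 million in hardware
```

An attacker needs only 51% of total hashrate, so the true hardware outlay sits just above half this figure; renting hashrate, as the NiceHash numbers above show, is cheaper still.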
So if somebody ever adapted a Keccak ASIC for mining, a 51% attack would become easier still. Let's move on. Remember how they said that 80% of the block rewards go to a community fund? That address is here, and it controls 55% of the SmartCash in existence. This address is used to fund proposals that are voted on by the community. The problem is that the private key to this address is owned by the developers - and regardless of their past honesty, this system still requires trust in them. A trust-required system is directly contrary to the principles of cryptocurrency. The developers, despite your trust in them, could still sell some of those coins at any time. Next up, we'll discuss SmartCash's privacy. SmartCash uses the Zerocoin protocol for privacy, as it was forked from Zcoin. Zerocoin breaks the link between sender and receiver, but unlike Zerocash and RingCT, it does not hide the transaction amount. Furthermore, SmartCash's privacy is optional, and it is transparent by default. Transparency by default is bad for the following reasons: (1) it reduces the anonymity set; (2) it makes private transactions inherently more suspicious; (3) it allows the sender to harm the privacy of the recipient; (4) it makes it impossible to hide your wealth; (5) it makes the currency non-fungible. My last concern with SmartCash is the coin distribution. Currently, as shown on the SmartCash Rich List, the top 100 addresses control 98.42% of funds. This is a highly unbalanced situation, and it means that the vast majority of SmartCash wealth is held by a small number of people. With Bitcoin, the top 100 addresses control roughly 32% of funds, which is not perfect, but certainly much better. In summary, SmartCash is a great idea - a private, community-oriented cryptocurrency - but it is executed in a suboptimal manner. I would be happy to hear your opinions on this, whether you agree or disagree. -KnifeOfPi2
This comment on BTC1 replay protection deserves its own thread.
I asked a question in another thread about what people meant by the blacklisted address on the BTC1 fork. User u/PM_ME_FPGA_TRICKS posted an excellent response which I wanted to share with the broader community, since it explains not only this issue but also how replay protection is supposed to work for BTC1. I didn't realize how laughably bad their solution to replay protection really was until reading this. The comment is at: https://www.reddit.com/Bitcoin/comments/74oi26/2x_is_already_dead_miners_will_not_mine_a_sha256/do05q0g/ and the text of the comment is below: Only thing that is confirmed is that there is a blacklisted address in btc1, that cannot receive coins. If you try to send coins to this address, your transaction is invalid and cannot be mined. The code is in the repo in primitives/transaction.cpp. So, if you want to split your BTC into BTC and BTC1, then you would use your BTC client to send some money to the blacklisted address. This transaction would go through on BTC, and you would lose the funds you sent to the address, but your change would come back replay-protected and locked to the BTC fork only. The transaction would be blacklisted on BTC1 and your coins would stay where they were. This is more complex than it sounds, because if you only send 1 sat to the blacklist, your wallet won't send all your coins in the transaction, so not all your funds will be replay-protected. This requires manual coin control or multi-recipient sending, which n00bs could easily screw up. It's also a minor security risk for LN. Imagine you are using LN on BTC1. LN works by the customer and supplier agreeing to put money into an escrow address, and then in 1 final transaction the escrowed funds are divided up, with final payment going to the supplier and change going to the customer. If the supplier and customer do not do any business, then the escrow times out, and the customer can recover their funds from the escrow address. 
However, if the customer sets up the LN payment to send change to the blacklisted address, then the channel's final payment will be blacklisted, the escrow account will time out and the funds can be recovered by the customer. This is, of course, trivial to work around - any LN client on BTC1 just needs to check that the address isn't blacklisted. Hardly rocket science, but still undesirable, and a source for code bloat and potential errors.
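A minimal sketch of the validity rule the comment describes, using hypothetical data structures (the address below is a placeholder, not the real one hard-coded in primitives/transaction.cpp): a transaction paying any amount to the blacklisted address is invalid on BTC1 but perfectly valid on BTC, which is what makes the coin-split trick work.

```python
# Placeholder only; the actual blacklisted address lives in the btc1 source.
BLACKLISTED_ADDRESS = "3BlacklistPlaceholderAddressXXXXXX"

def is_valid_on_btc1(outputs):
    """outputs: list of (address, amount_in_satoshi) pairs.
    BTC1 rejects any transaction with an output to the blacklist;
    BTC has no such rule and would mine the transaction normally."""
    return all(addr != BLACKLISTED_ADDRESS for addr, _amount in outputs)

# A coin-split transaction: burn 1 sat to the blacklist, send the rest
# back to yourself. Mined on BTC (change now replay-protected), but
# rejected on BTC1, so the BTC1 coins stay where they were.
split_tx = [(BLACKLISTED_ADDRESS, 1), ("3YourChangeAddressXXXXXXXXXXXXXXXX", 99_999)]
print(is_valid_on_btc1(split_tx))  # False
```

This also makes the LN failure mode above concrete: a channel-close transaction whose change output hits the blacklist becomes unmineable on BTC1, so the escrow simply times out back to the customer.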
To arms, Bitcoin community! Help us complete this mining installation for the Zürich MoneyMuseum. We are not asking for funds - only your expertise is needed! $20 tip if you give us the relevant clue to solve or mitigate our main problem. Nice pictures of the exhibition inside as well…
Edit: A big thank you to all the people who helped us - we can now mine true PPS with diff1! The people in this thread who helped most have been awarded. I also want to mention the operator of btcmp.com, denis2342, and Luke-Jr. Actually looking at the miner screen in the Linux terminal helped a lot ;-). The pool kept switching us to stratum with variable difficulty. Getwork with long polling seems to be the default after disabling stratum... We will probably post again when there is a video of the installation in action... Again, many thanks. Learned a lot. Edit: Thank you for all the answers so far! We will try different things now and report back. The tip bounty will be distributed as soon as we have found out what finally does the trick. This could take a few days. The offered tip will be distributed, and very likely a few others as well. First of all, let me tell you that the Bitcoin Exhibition at the Zürich MoneyMuseum is most likely the biggest and most diverse of its kind. Please read more about the museum and the exhibition below. Help us solve the following problem we are experiencing with our “Muscle Powered Proof of Work” installation: A friend and I have invested a lot of time to build an installation for the museum. It is basically a 10 GHash/s miner and a Raspberry Pi powered by a hand generator (Maxon DC motor with planetary gear). Here are some pictures of the installation, although it is not entirely put together yet. There are still some changes planned. https://www.dropbox.com/sh/0qcvl3wu4romhnt/AAAYF08lnVAy6W6KEepE7e2Ua?dl=0 Now let’s get to the core of our problem: We are mining at the getwork diff1 pool btcmp.com as it is a true PPS pool with getwork diff1. The visitors in the museum can power the generator for 2-3 min and directly see how many satoshis the "network" (actually the pool, but we don't want to confuse the visitors too much at that point) has given the museum for their work. 
This all works well so far, but one problem remains. Sometimes the pool does not get a share from us for more than 40 seconds, or even more than 60 in some cases. I have calculated that with 8.4 GHash/s we should find a share about every 0.5 seconds on average (diff1). I think when the pool gets a share it gets all the hashes, as it then accounts for several satoshis. Statistically we get per minute what we should get in theory. However, we would very much like to lower the time between the shares accepted by the pool. This would make the overall experience much smoother for the visitors. Please look at this screenshot from MinePeon and answer some questions: https://www.dropbox.com/s/lb1jei4trc9kqe5/MinePeonScreenshot.png?dl=0 We see that we get a lot of diff1 hashes. However, only 11 shares/packages have been accepted. Is there a possibility to set the miner software so it submits to the pool as soon as a share is found? It seems to send them in packages which sometimes have 4-5 seconds in between, but sometimes as much as 80 seconds. I would like to submit packages of hashes much more often. How can this be influenced? What exactly are the getworks (GW)? What exactly are the accepted ones (Acc)? This is where the TipBounty is. Help us get a better Acc/diff1 ratio. Best would be 1:1. What exactly are the rejected ones (Rej)? What exactly are the discarded ones (Disc)? What exactly are the difficulty one hashes (diff1)? Some of these questions seem very basic, but it is important for us to understand what these are and how we can influence them. We have a 1:1 correlation between the Acc and the pool-side acknowledgement of shares/packages. Whenever MinePeon shows one more for this value, the pool's value for the last submitted share goes to “moments ago”. Does the miner software have a setting where we can set after how many diff1 hashes a package of hashes is sent to the pool? If not, do you have another idea why so few are sent? 
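As a sanity check, the 0.5-second figure quoted above follows directly from the diff1 share rate: a difficulty-1 share is found once per 2^32 hashes on average.

```python
# Expected time between diff1 shares at the installation's hashrate.
hashrate = 8.4e9                   # 8.4 GHash/s from the hand-powered rig
hashes_per_diff1_share = 2 ** 32   # one diff1 share per 2^32 hashes on average
interval = hashes_per_diff1_share / hashrate
print(round(interval, 2))          # 0.51 seconds between shares on average
```

So roughly two shares per second are being found; the 40-80 second gaps observed at the pool must come from the miner batching submissions, not from a lack of shares.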
Ideally we would set it so the diff1 hashes are sent every 5 seconds or so, probably even more often. Is stratum with fixed diff1 possible? If so, would it be better to use stratum? Are there critical settings we should know of? (We have tried --request-diff and --no-submit-stale.) We are using BFGMiner on MinePeon, if that matters. We could switch to CGMiner if that would help. Any help is very much appreciated. The museum is doing a great job explaining Bitcoin basics. We had a special focus on interactive learning and have several things to underline this. I hope to hear back from you so we can improve our installation. Please don't hesitate to ask if you have further questions. We are both not mining experts. Thanks for reading and AMA. SimonBelmond Current features of the Bitcoin exhibition at the Zürich MoneyMuseum:
Live screen with various stats/charts/parameters/transactions…
Muscle powered PoW: Hand generator with 5v and 3.5-5A output, Raspberry Pi, MinePeon, 5x Antminer U2+ plus a screen to show the hash-rate at the pool and/or in MinePeon web interface. This screen will not be hand powered. This installation will complement their coining die (go to 1:27 to see what I mean).
The Bitcoin mining evolution (CPU, GPU, FPGA, ASIC)
A few short (2-3 minutes) interviews.
Other wallets, Trezor, PiperWallet
ATM Prototype, functional
PiperWallet to use.
Casascius and other physical Bitcoins, Wallets (also some commemorative coins), Paper wallet like one out of the first Bitcoin (A)TM ever
12 Picture tours
Bitcoin for beginners
Debunking 13 Bitcoin myths
What you definitely have to know
The history of Bitcoin
Bitcoin and traditional forms of money
Alternatives to Bitcoin
Citations about Bitcoin
How do I open an account?
How do I get Bitcoin?
Bitcoin community and economy
Bitcoin as a platform
I see this as a good opportunity for Bitcoin, so let's embrace it. I am especially excited to compare the traditional forms of money which used proof of work to the new money which also uses proof of work. I think in that context it will be much easier for the visitors to value this concept. A lot of schools and other groups book guided tours at the museum. It is open every Friday from December 5 on. Entry is free of charge. Edit: Markdown, typos
New to r/Tokenmining? Click here for more in-depth info!
What is EIP:918?
EIP:918 is an Ethereum Improvement Proposal for standardizing mineable token distribution using Proof of Work. The primary driver behind the standard is to address the very broken ICO model that currently plagues the Ethereum network. Token distribution via the ICO model and its derivatives has always been susceptible to illicit behavior by bad actors. New token projects are centralized by nature because a single entity must handle and control all of the initial coins and all of the raised ICO money. By distributing tokens via an alternative 'Initial Mining Offering' (or IMO), the ownership of the token contract no longer belongs with the deployer at all, and the deployer is 'just another user.' As a result, investor risk exposure under a mined token distribution model is significantly diminished. This standard is intended to be standalone, allowing maximum interoperability with ERC20, ERC721, and future token standards. The most effective economic side effect of Satoshi Nakamoto's decision to secure the original Bitcoin network with Proof of Work hash mining was tethering the coin to real computing power, thereby removing centralized actors. By transitioning the responsibility of work back onto individual miners, government organizations have no jurisdiction over the operation of a pure mined token economy. Oversight is removed from an equation whereby miners provide economic effort in direct exchange for a cryptographic commodity. This facilitates decentralized distribution and establishes all involved parties as stakeholders. The ERC918 standard allows projects to be funded through decentralized computing power instead of centralized, direct-fiat conversion. The Ethereum blockchain in its current state exists as a thriving ecosystem which allows any individual to store immutable records in a permissionless, tamper-resistant and transparent manner.
Recently, there have been proposals to mitigate some initial ICO investment risks through the introduction of the DAICO model, which relies on timed and automated value transfers via the smart contract's tapping mechanism. However, this does not qualify a token smart contract as a non-security, and it still has the potential to put investors at risk if not implemented carefully, relying on centralized actors to be fair and aligned with the community's intent. Allowing users of the network direct access to tokens by performing computations as proof of work lets any smart contract distribute a token in a safe and controlled manner, similar to the release of a commodity. As of 2017, all Ethereum token distribution methods were flawed and susceptible to Sybil attacks. A Sybil attack is a form of computer security attack where one person pretends to be many people, using multiple computer accounts, in order to manipulate a system in a malicious way. ICOs and airdrops are highly susceptible to these types of attacks, and there is no way to verify that all ERC20 tokens distributed by the deployer were doled out fairly. Proof of Work distribution is resistant to Sybil attacks. This means that ERC918 tokens are among the first trustless Ethereum tokens in the world. The distribution of ERC918 tokens is fair because they are allotted via an open, decentralized mathematical algorithm (that anyone can view on the mainnet blockchain) and not by a centralized human monarchy. ERC918's first incarnation (and inspiration) was the 0xBitcoin project that launched in early 2018. Since then, several projects have realized the standard in innovative and creative ways. Catether (0xCATE) erupted early and additionally mints payback tokens during transfer operations to offset gas costs.
0xGold and 0xLitecoin each implement the first on-chain merge-mining with 0xBitcoin, and the Mineable Gem project extends the standard onto non-fungible collectible artifacts, whereby each gem has a unique mining difficulty. The Mineable project is a newer initiative that provides users with the ability to create mineable ERC20 tokens on-chain without writing a line of code, and includes a virtualized hashing artifact market that allows miners to purchase on-chain vGPUs to improve mining difficulty and rewards. (written by jlogelin)
MINING IN A NUTSHELL
0xBitcoin is a Smart Contract on the Ethereum network, and the concept of Token Mining is patterned after Bitcoin's distribution. Rather than solving 'blocks', work is issued by the contract, which also maintains a Difficulty which goes up or down depending on how often a Reward is issued. Miners can put their hardware to work to claim these rewards, in concert with specialized software, working either by themselves or together as a Pool. The total lifetime supply of 0xBitcoin is 21,000,000 tokens and rewards will repeatedly halve over time. The 0xBitcoin contract was deployed by Infernal_Toast at Ethereum address: 0xb6ed7644c69416d67b522e20bc294a9a9b405b31
MINING IN MORE DETAIL (Gee-Whiz Info)
0xBitcoin's smart contract, running on the Ethereum network, maintains a changing "Challenge" (generated from the previous Ethereum block hash) and an adjusting Difficulty Target. Like traditional mining, the miners use the SoliditySHA3 algorithm to solve for a Nonce value that, when hashed alongside the current Challenge and their Minting Ethereum Address, is less-than-or-equal-to the current Difficulty Target. Once a miner finds a solution that satisfies the requirements, they can submit it to the contract (calling the Mint() function). This is most often done through a mining pool. The Ethereum address that submits a valid solution first is sent the 50 0xBTC Reward. (In the case of Pools, valid solutions that do not satisfy the full difficulty specified by the 0xBitcoin contract, but that DO satisfy the Pool's specified Minimum Share Difficulty, get a 'share'. When one of the Miners on that Pool finds a "Full" solution, the number of shares each miner's address has submitted is used to calculate how much of the 50 0xBTC reward they will get.) After a Reward is issued, the Challenge changes.
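The solution search described above can be sketched as a simple loop. Note this is only an illustration of the structure: the real contract uses Ethereum's keccak-256 (SoliditySHA3), which Python's standard library does not ship, so `hashlib.sha3_256` (NIST SHA3) stands in here, and the challenge bytes and easy target are made up so the demo finishes instantly:

```python
import hashlib

# Illustrative sketch of the ERC918-style solution check: find a nonce
# whose hash with (challenge, minter address) is <= the difficulty target.
# NOTE: hashlib.sha3_256 is a stand-in for Ethereum's keccak-256.
def find_solution(challenge: bytes, minter: bytes, target: int, max_tries: int = 1_000_000):
    """Return (nonce, digest) once the hash meets the target, or None."""
    for nonce in range(max_tries):
        digest = hashlib.sha3_256(
            challenge + minter + nonce.to_bytes(32, "big")
        ).digest()
        if int.from_bytes(digest, "big") <= target:
            return nonce, digest
    return None

easy_target = 2**252                     # absurdly easy so the demo is fast
challenge = b"\x11" * 32                 # would come from a recent Ethereum block hash
minter = bytes.fromhex("b6ed7644c69416d67b522e20bc294a9a9b405b31")
result = find_solution(challenge, minter, easy_target)
print(result is not None)  # True
```

A pool simply hands miners a much easier target than the contract's, so that "shares" arrive frequently enough to measure each miner's contribution.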
HOW DIFFICULTY ADJUSTMENT WORKS
A Retarget happens every 1024 rewards. In short, the Contract tries to target an Average Reward Time of about 60 times the Ethereum block time. So (at the time of this writing): ~13.9 seconds × 60 ≈ 13.9 minutes. If the average Reward Time is longer than that, the difficulty will decrease; if it's shorter, it will increase. How much longer or shorter it was affects the magnitude with which the difficulty will rise/drop, to a maximum of 50%. Click here to visit the stats page (https://0x1d00ffff.github.io/0xBTC-Stats) to see recent stats and block times; feel free to ask questions about it if you need help understanding it.
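A minimal sketch of the retarget rule described above. The exact contract formula is not given here, so the proportional adjustment capped at 50% in either direction is an assumption based on the description:

```python
# Sketch of the 0xBitcoin-style retarget described above (assumed to be
# proportional to the deviation from the target time, capped at +/-50%;
# the actual contract logic may differ in detail).
ETH_BLOCK_TIME = 13.9                        # seconds, at the time of writing
TARGET_REWARD_TIME = 60 * ETH_BLOCK_TIME     # ~13.9 minutes

def retarget(current_difficulty: float, avg_reward_time: float) -> float:
    """Raise difficulty when rewards come too fast, lower it when too slow."""
    ratio = TARGET_REWARD_TIME / avg_reward_time
    ratio = max(0.5, min(1.5, ratio))        # cap the adjustment at 50%
    return current_difficulty * ratio

print(retarget(1000.0, TARGET_REWARD_TIME))      # on target: stays 1000.0
print(retarget(1000.0, TARGET_REWARD_TIME * 4))  # far too slow: capped drop to 500.0
print(retarget(1000.0, TARGET_REWARD_TIME / 4))  # far too fast: capped rise to 1500.0
```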
Presently, 0xBitcoin and "Alt Tokens" can be mined on GPUs, CPUs, IGPs (on-CPU graphics) and certain FPGAs. The most recommended hardware is nVidia graphics cards for their efficiency, ubiquity and relatively low cost. As general rules, the more cores and the higher core frequency (clock) you can get, the more Tokens you will earn!
Mining on nVidia cards:
Pascal (GTX 10x0) cards are usually the best choice due to their power efficiency. Maxwell Generation 2 (GTX 9xx) cards are also a good choice and are often great overclockers, but they use more power and generate more heat. Any fairly recent nVidia card supporting CUDA should be capable of mining Tokens. It's possible to mine in OpenCL mode on nVidia devices, but it is preferable to use CUDA for substantially better performance. (See Mining Software section.)
Mining on AMD cards:
AMD GPUs are quite capable of Token mining, though they can't achieve quite the same performance that nV/CUDA GPUs can at this time. Because of their typically-high memory bandwidth (especially cards with HBM/HBM2), it is possible to mine 0xBitcoin/ERC918 Tokens alongside a Video Memory-intensive algorithm like Ethash or Cryptonight! (See Mining Software section.)
Mining on IGPs (e.g. AMD Radeon and Intel HD Graphics):
This type of GPU is considerably less powerful than a discrete GPU, but is still capable of mining. They can supplement hashpower from other devices. The best performance should come from a chip with a larger number of Shader cores (like a Zen-based APU), but even typical Intel IGPs can submit shares and earn Tokens. (See Mining Software section.)
Clocks and Power Levels:
The algorithm used for 0xBitcoin and Alt-Token mining uses the faster memories in the GPU core instead of Video Memory. As a result, it is advisable to underclock the Memory, which will save a little power, reduce memory temperature and sometimes enable the GPU core to hit higher clock speeds with stability. A card's Power Limit and Core Voltage can be tweaked to attain the best efficiency for individual cards. Pascal cards (like GTX 10x0) are generally more temperature-sensitive when overclocked; reducing Core temperature can often stabilize higher overclocks better than adding voltage can. Maxwell-Gen2 cards (like GTX 9xx) can usually be overclocked further at higher temperatures.
V4.x versions are a near-total 'Modern' C++ rewrite/redesign for 64-bit Windows, built for speed, ease of use and stability. It supports nVidia/CUDA devices and Pool Mining; Solo and CPU mining are both planned. Features a fully-integrated GUI, numerous optimized assembly functions for speed (nicknamed 'Hashburner'), and supports multiple GPUs running in a single instance since v4.1. Auto-Donation/devfee with a default of 1.5%. Under active development!
A fork of 0xBitcoin-Miner designed for enhanced speed and fewer invalid shares at the Pool level. It is somewhat older and is built using a combination of NodeJS/C++/CUDA. It has versions available for 64-bit Windows and Linux and runs from a command-line interface. Comes in multiple versions with a 1, 1.5 or 2% "Auto-Donation"/devfee. Not under development at this time, but still relevant.
A Command-Line Interface miner that aims to provide functionality similar to that of "CCMiner" for other algorithms for 0xBitcoin and other ERC-918s. As such, it offers an API for integrating with Mining management software and integration with HiveOS & EthOS. It also supports OpenCL devices (such as AMD cards and Intel IGPs.) Has a minimum Auto-Donation/devfee of 1.5% (with a default of 2.0%.) Under active development!
AIOMiner is an All-In-One GPU Mining software for Windows that boasts support for over 55 different algorithms, is free to use, and eliminates the need to configure batch files through its easy to use interface.
TokenMiner is based upon Genoil Ethminer and was the first to add support for OpenCL devices (AMD GPUs/APUs.) It supports CPU and Pool/Solo mining from its command-line interface (in -C or -G, -S or -P modes.) It can also mine on nVidia/CUDA cards (in OpenCL mode, albeit with lesser performance.) Has a 1% "devfee" running in Pool Mode. This miner has since been forked for compatibility with some FPGAs!
v2.10.4 is an enhancement of the original 0xBitcoin-Miner with CUDA support added by Mikers and enhanced by Azlehria. "Nabiki" is a C++-only version, with no NodeJS code, which supports Pool Mining (just not Solo) and works on Windows 64-bit and Linux. Source code is available with pre-packaged binaries and a GUI in the works. Has a 2.5% "devfee". Under active development!
Older Miners: Older and possibly-unsupported miner versions can be found at the above link for historical purposes and specific applications, including the original NodeJS CPU miner by Infernal Toast/Zegordo, the '1000x' NodeJS/C++ hybrid version of 0xBitcoin-Miner and Mikers' enhanced CUDA builds.
FOR MORE INFORMATION...
If you have any trouble, the friendly and helpful 0xBitcoin community will be happy to help you out. Discord has become 0xBTC's community hub, where you can get answers the fastest from devs and helpful community members. Or message one of the community members on reddit listed below.
What is Bitcoin? Bitcoin is an experimental system of transfer and verification of property based on a peer-to-peer network without any central authority. The initial application and main innovation of the Bitcoin network is a decentralized digital currency system whose unit of account is the bitcoin. Bitcoin works with software and a protocol that allow participants to issue bitcoins and manage transactions in a collective and automatic way. As a free, open-source protocol, it also allows interoperability between the software and services that use it. As a currency, bitcoin is both a medium of payment and a store of value. Bitcoin is designed to self-regulate. Its limited inflation is distributed homogeneously across the network's computing power, and the supply will be limited to 21 million units, each divisible to the eighth decimal place. The functioning of the exchange is secured by a general organization that everyone can examine, because everything is public: the basic protocols, the cryptographic algorithms, the programs making them operational, the account data, and the discussions of the developers. Possession of bitcoins is materialized by a sequence of numbers and letters that make up a key allowing the spending of the bitcoins associated with it on the ledger. A person may hold several keys, compiled in a 'Bitcoin wallet' (web, software or hardware), which allows access to the network in order to make transactions. The wallet holds the public keys used to check the balance in bitcoins and to receive payments. It also contains (often in encrypted form) the private keys associated with the public keys. These private keys must remain secret, because their owner can spend the bitcoins associated with them on the ledger. Any medium that can hold the sequence of symbols constituting your keychain will do: paper, USB stick, memory card, etc.
With appropriate software, you can manage your assets on your computer or your phone. You can receive bitcoins on an account either from a holder of bitcoins who gives them to you, for example in exchange for goods, or by going through an exchange platform that converts conventional currencies into bitcoins, or by earning them through participating in the operations of collective control of the currency. The Bitcoin source code has been released under the open-source MIT license, which allows anyone to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the software, subject to inserting a copyright notice into all copies. Bitcoin creator: Satoshi Nakamoto. What is Bitcoin mining? Technical details: During mining, your computer performs cryptographic hashes (two successive SHA-256 passes) on what is called a block header. For each new hash, the mining software uses a different number called the nonce. A hash is valid only if it is below the current target; the lower the target, the harder valid hashes are to find. This is expressed as the mining difficulty, which is calculated by comparing how hard it is to generate a block now with how hard it was to generate the very first block. A difficulty of 70,000 therefore means 70,000 times more effort than it took Satoshi Nakamoto to generate the first block, back when mining was much slower and poorly optimized. The difficulty changes every 2016 blocks. The network adjusts the difficulty so that, at the current global computing power, generating 2016 blocks takes exactly 14 days. That is why the difficulty increases along with the power of the network. Hardware: In the beginning, mining with a processor (CPU) was the only way to mine bitcoins. Graphics cards (GPUs) eventually replaced CPUs because their architecture allowed a 50x to 100x increase in computing power while using less electricity per megahash than a CPU.
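The double-SHA-256 check described above can be sketched as follows. The "header" bytes and the very easy target are made up purely for illustration; a real Bitcoin header is a fixed 80-byte structure and real targets are vastly harder:

```python
import hashlib

# Toy illustration of Bitcoin's proof-of-work check: hash the header
# twice with SHA-256 and accept the nonce if the result is below the target.
def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def meets_target(header: bytes, nonce: int, target: int) -> bool:
    """A hash is valid when, read as an integer, it is below the target."""
    digest = double_sha256(header + nonce.to_bytes(4, "little"))
    return int.from_bytes(digest, "big") < target

header = b"toy-block-header"       # made up; not a real 80-byte header
easy_target = 2**252               # absurdly easy so the loop finishes instantly
nonce = 0
while not meets_target(header, nonce, easy_target):
    nonce += 1
print(meets_target(header, nonce, easy_target))  # True
```

Lowering the target makes the `while` loop run exponentially longer on average, which is exactly what a difficulty increase means.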
Although any modern GPU can be used for mining, AMD's GPU architecture proved to be far superior to nVidia's for mining bitcoins, and the ATI Radeon HD 5870 was the most economical card for a time. For a more complete list of graphics cards and their performance, see the Bitcoin Wiki: comparison of mining equipment. In the same way as the CPU-to-GPU transition, the world of mining then evolved to the use of Field Programmable Gate Arrays (FPGAs) as a mining platform. Although FPGAs did not offer the 50x to 100x increase in calculation speed of the transition from CPU to GPU, they offered much better energy efficiency. A typical 600 MH/s graphics card consumes about 400 W of power, while a typical FPGA device can offer a hash rate of 826 MH/s at 80 W of power consumption, a gain of 5x more calculations for the same amount of energy. Since energy efficiency is a key factor in the profitability of mining, the GPU-to-FPGA migration was an important step for many people. The world of bitcoin mining is now migrating to Application Specific Integrated Circuits (ASICs). An ASIC is a chip designed specifically to accomplish a single task. Unlike FPGAs, an ASIC cannot be reprogrammed for other tasks. An ASIC designed to mine bitcoins cannot and will not do anything other than mine bitcoins. The rigidity of an ASIC makes it possible to offer a 100x increase in computing power while reducing power consumption compared to all other technologies. For example, a typical device offers 60 GH/s (1 GH/s equals 1000 MH/s) while consuming 60 W of electricity. Compared to a GPU, that is a 100x increase in computing power and a reduction of power consumption by a factor of 7. Unlike the generations of technology that preceded it, the ASIC is the "end of the line" when we talk about important technology changes.
CPUs were replaced by GPUs, themselves replaced by FPGAs, which were replaced by ASICs. There is nothing that can replace ASICs now or in the immediate future. There will be technological refinements in ASIC products, and improvements in energy efficiency, but nothing that can match a 50x to 100x increase in computing power or a 7x reduction in power consumption compared with the previous technology. This means that energy efficiency is the single most important factor of any ASIC product, since the expected lifetime of an ASIC device is longer than the entire history of bitcoin mining so far. It is conceivable that an ASIC device purchased today will still be in operation in two years, provided the unit remains economical enough to justify its power consumption. The profitability of mining is also determined by the value of bitcoin, but in all cases, the better a device's energy efficiency, the more profitable it is. Software: There are two ways to mine: by yourself or as part of a team (a pool). If you are mining for yourself, you must install the Bitcoin software and configure it for JSON-RPC (see: run Bitcoin). The other option is to join a pool; there are multiple pools available. With a pool, the profit generated by any block found by a member of the team is split between all the members. The advantage of joining a pool is to increase the frequency and stability of earnings (this is called reducing the variance), but the individual payouts will be smaller. In the end, you will earn the same amount with either approach: mining solo yields large but very infrequent earnings, while mining with a pool offers small but steady gains. Once you have your software configured or have joined a pool, the next step is to configure the mining software. The most popular software for ASIC/FPGA/GPU mining currently is CGMiner, or a derivative designed specifically for FPGAs and ASICs, BFGMiner.
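The solo-versus-pool point above can be checked with a toy simulation: both strategies have the same expected earnings, but pooled mining smooths them out. All numbers here are made up for illustration:

```python
import random

# Toy comparison of solo vs pooled mining: identical expected earnings,
# very different variance. All parameters are invented for illustration.
random.seed(1)

P_BLOCK = 0.001          # chance your hardware finds a block on a given day
BLOCK_REWARD = 50.0      # coins per block
POOL_SHARE = P_BLOCK     # your fraction of a pool that finds ~1 block/day

def solo_day() -> float:
    """All or nothing: the full reward on rare lucky days."""
    return BLOCK_REWARD if random.random() < P_BLOCK else 0.0

def pool_day() -> float:
    """Your proportional share of the pool's steady daily reward."""
    return BLOCK_REWARD * POOL_SHARE

days = 100_000
solo = sum(solo_day() for _ in range(days)) / days
pool = sum(pool_day() for _ in range(days)) / days
print(f"solo avg/day: {solo:.4f}, pool avg/day: {pool:.4f}")  # both near 0.05
```

The averages converge to the same value; the difference is that the solo miner's income arrives in rare 50-coin spikes while the pool miner's arrives daily.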
If you want a quick overview of mining without installing any software, try Bitcoin Plus, a Bitcoin miner running in your browser on your CPU. It is not profitable for serious mining, but it is a good demonstration of the principle of pooled mining.
HPB is a new blockchain architecture, positioned as an easy-to-use, high-performance blockchain platform. It aims to extend the performance of distributed applications to meet real-world business needs.
TPS Performance
• Transaction volume and applications are increasing rapidly on current blockchains, making network congestion a major issue
• Bitcoin, a legacy network, is confined to about 7 TPS. Compared to Visa, with an average TPS of 2,000 and a maximum TPS of 50,000, it is extremely slow
• Even the Lightning Network aims for just 1,000 TPS, far below existing industry standards
Poor Security
• Blockchains such as Ethereum have experienced security breaches
• Community splits can lead to multiple hard forks, leading to an increasingly splintered blockchain network
• There is still no real solution to a 51% network attack, where a majority of mining resources are turned against the network
High Transaction Fees
• Congestion in blockchain networks, especially during high-load times and major events, leads to higher fees
• Block size can also lead to network latency and excessive confirmation times, resulting in more expensive transactions
Difficulty in Developing DAPPs
• Due to the above problems, developers have issues creating optimized DAPPs for existing blockchain networks
The software architecture provides accounts, identity and authorization management, policy management, databases, and asynchronous communication on thousands of CPUs, FPGAs or clustered program schedules. This is a new architecture that can support millions of transactions per second and support authorizations within seconds. Web: http://www.gxn.io/
Hello Bitcoin! I have been following the community from the sidelines for a few years now. I've read everything I can, picked through the source code and papers and learned a lot of crypto in the process. It has been difficult at times. I thought I had "missed the boat" when the bitcoin exchange rate was skyrocketing. I was disheartened when the value crashed and it was dismissed by my friends. I was anxious when my GPU mining was eclipsed by dedicated rigs and FPGAs. Still, Bitcoin has pushed through it all and come out stronger for the wear. There is enough evidence to convince me that the benefits of cryptocurrency are so numerous as to make its adoption inevitable. One day, we will wonder how anyone could have doubted the effect a purely digital currency would have, just like how the benefits of the internet itself seem so obvious to us now. Yes, the value will fluctuate, but I firmly believe that cryptocurrency won't just disappear overnight with so many people collectively working to strengthen and grow the system. There are still challenges ahead. One that has had a lot of attention recently is that bitcoins only have value if they are being used. Otherwise, the value is just based on speculation, and we all know what problems that has caused. Bitcoin Friday is a great idea, and projects like the BitInstant Paycard could certainly help with this in the future, but there is centralization, fees and a lot of trust involved here. I don't think this meshes with what should be a truly p2p system: we need to be able to use bitcoins as money without having to trust anyone. Today, I'd like to start using Amazon Gift Cards to help fill the gap. This is what I propose: I will gift brand-new Amazon Digital Gift Cards to you in exchange for your bitcoins. This allows anyone to turn bitcoins into goods/services, but we cut out speculators that trade in the currency and don't actually provide value or grow the market.
You have to actually be buying something for this to be useful to you, which is exactly the point. I hope that by posting with my real account and name, you will have confidence that my intentions are good and I will not scam anyone. Still, the whole point of Bitcoin is to minimize trust, so this is what I propose: The exchange rate is pegged to the median value of the three USD market prices on the front page of bitcoincharts.com. For simplicity, I'll peg the rate today at 0.4371 BTC = $5 USD
PM or post:
total US dollar value of Amazon Gift Cards you wish to purchase
the email address you want them sent to
I will purchase a $5 USD gift card and send it to you. I am going out on a limb here to show my good intentions, and I will trust you first. In the gift note, I'll include the destination address for your 0.4371 bitcoins.
You fund the transaction by sending bitcoins to the destination address.
I monitor the destination address. As you deposit bitcoins, I will purchase gift cards for you, up to the amount we agreed on. We can do this $1 at a time, $5 at a time, or $50, your choice. Amazon sets the minimum amount at $0.15, so I can't send cards smaller than that.
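The pegged conversion above works out as follows; this is just the poster's stated rate (0.4371 BTC = $5) and Amazon's stated $0.15 minimum turned into a small helper:

```python
# Pegged conversion from the offer above: 0.4371 BTC = $5.00 USD,
# with Amazon's $0.15 minimum gift card size.
PEG_BTC = 0.4371
PEG_USD = 5.00
MIN_CARD_USD = 0.15

def btc_owed(card_usd: float) -> float:
    """BTC to deposit for a gift card of the given USD value."""
    if card_usd < MIN_CARD_USD:
        raise ValueError("Amazon's minimum gift card is $0.15")
    return card_usd * PEG_BTC / PEG_USD

print(round(btc_owed(5.00), 4))   # 0.4371
print(round(btc_owed(50.00), 4))  # 4.371
```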
Comments? Questions? Let me know!! Edit 1: Thanks to everyone who's contacted me so far! Edit 2: Well this has been fun, but I'm signing off for the night. I'll leave this post as a standing offer until otherwise noted.
https://seekingalpha.com/article/4152240-amds-growing-cpu-advantage-intel?page=1 AMD's Growing CPU Advantage Over Intel Mar. 1, 2018 | About: Advanced Micro Devices (AMD) Raymond Caron, Ph.D. Tech, solar, natural resources, energy (315 followers) Summary: AMD's past and economic hazards. AMD's current market conditions. AMD's Zen CPU advantage over Intel. AMD is primarily a CPU fabrication company with much experience and a great history in that respect. It holds patents for 64-bit processing, as well as ARM-based processing patents and GPU architecture patents. AMD built a name for itself in the mid-to-late 90's when it introduced the K-series CPUs to good reviews, followed by the Athlon series in '99. AMD was profitable, and it bought the companies NexGen, Alchemy Semiconductor, and ATI. Past Economic Hazards: If AMD has such a great history, then what happened? Before I go over the technical advantage that AMD has over Intel, it's worth looking at how AMD failed in the past, to see if those hazards still present a risk, since for investment purposes we're most interested in AMD turning a profit. AMD suffered from intermittent CPU fabrication problems, and was also the victim of sustained anti-competitive behaviour from Intel, which interfered with AMD's attempts to sell its CPUs to the market through Sony, Hitachi, Toshiba, Fujitsu, NEC, Dell, Gateway, HP, Acer, and Lenovo. Intel was investigated and/or fined by multiple countries including Japan, Korea, the USA, and the EU. These hazards need to be examined to see if history will repeat itself. There have been some rather large changes in the market since then. 1) The EU has shown it is not averse to levying large fines, and Intel is still fighting the guilty verdict from the last EU fine levied against it; it has already lost one appeal. It's conceivable that the EU, and other countries, would prosecute Intel again.
This is compounded by the recent security problems with Intel CPUs and the fact that Intel sold these CPUs under false advertising as secure when Intel knew they were not. Here are some of the largest fines dished out by the EU. 2) The Internet has evolved from Web 1.0 to 2.0. Consumers are increasing their online presence each year. This reduces the clout that Intel can wield over the market, as AMD can more easily sell to consumers through smaller Internet-based companies. 3) Traditional distributors (HP, Dell, Lenovo, etc.) are struggling. All of these companies have had recent issues with declining revenue due to Internet competition and ARM competition. These companies are struggling for sales, and this reduces the clout that Intel has over them, as Intel is no longer able to ensure their future. It no longer pays to be in the club. These points are summarized in the graph below, from Statista, which shows "ODM Direct" sales and "other sales" increasing their market share from 2009 to Q3 2017. 4) AMD spun off Global Foundries as a separate company. AMD has a fabrication agreement with Global Foundries, but is also free to fabricate at another foundry such as TSMC, where AMD has recently announced it will be printing Vega at 7nm. 5) Global Foundries developed the capability to fabricate at 16nm, 14nm, and 12nm alongside Samsung and IBM, and bought the process from IBM to fabricate at 7nm. These three companies have been cooperating to develop new fabrication nodes. 6) The computer market has grown much larger since the mid-90's to 2006 period, when AMD last had a significant tangible advantage over Intel; computer sales rose steadily until 2011 before starting a slow decline (see the Statista graph below). The decline corresponds directly to the loss of competition in the marketplace between AMD and Intel after AMD released the Bulldozer CPU in 2011.
Tablets also became available starting in 2010 and contributed to the fall in computer sales, which started in 2012. It's important to note that computer shipments did not fall in 2017; they remained static, and AMD's GPU market share rose in Q4 2017 at the expense of Nvidia and Intel. 7) In terms of fabrication, AMD has access to 7nm through Global Foundries as well as through TSMC. It's unlikely that AMD will experience CPU fabrication problems in the future. This is something of a reversal of fortunes, as Intel is now experiencing issues with its 10nm fabrication facilities, which are behind schedule by more than two years, and maybe longer. It would be costly for Intel to use another foundry to print its CPUs due to the overhead that its current foundries have on its bottom line. If Intel is unable to get the 10nm process working, it is going to have difficulty competing with AMD. AMD: Current market conditions. In 2011 AMD released its Bulldozer line of CPUs to poor reviews and was relegated to selling on the discount market, where sales margins are low. Since that time AMD's profits have been largely determined by the performance of its GPU and Semi-Custom business. Analysts have become accustomed to looking at AMD's revenue from a GPU perspective, which isn't currently seen in a positive light due to the relation between AMD GPUs and cryptocurrency mining. The market views cryptocurrency as a further risk to AMD. When Bitcoin was introduced, it was also mined with GPUs. When mining switched to ASICs (simple, inexpensive, purpose-built circuits) for increased profitability, the GPUs purchased for mining were resold on the market and ended up competing with, and hurting, new AMD GPU sales. There is also perceived risk to AMD from Nvidia, which has favorable reviews for its Pascal GPU offerings.
While AMD has been selling GPUs, it hasn't increased GPU supply in response to cryptocurrency demand, while Nvidia has. This has resulted in very high prices for AMD GPUs relative to Nvidia's. There are strategic reasons for AMD's current position: 1) While AMD GPUs are profitable and greatly desired for cryptocurrency mining, AMD's market access is through third-party resellers, who enjoy the revenue from marked-up GPU sales. AMD most likely makes lower margins on GPU sales relative to Zen CPU sales due to the higher fabrication costs associated with larger dies and the corresponding lower yield. For reference, I've included the sizes of AMD's and Nvidia's GPUs as well as AMD's Ryzen CPU and Intel's Coffee Lake 8th-generation CPU. This suggests that if AMD had to pick and choose between products, they'd focus on Zen, due to the higher yield, higher revenue from sales, and increased margin. 2) By maintaining historical levels of GPU production in the face of cryptocurrency demand while increasing production of Zen products, AMD maximizes potential income from its highest-margin products (EPYC) while reducing future vulnerability to second-hand mining GPUs being resold on the market. 3) AMD was burned in the past by second-hand GPUs and wants to avoid repeating that experience. AMD has stated several times that the cryptocurrency boom was not factored into forward-looking statements, meaning it hasn't produced more GPUs in expectation of more GPU sales. In contrast, Nvidia increased its production of GPUs due to cryptocurrency demand, as AMD did in the past. Since its Pascal GPU has entered its second year on the market and is capable of running video games for years to come (1080p and 4K gaming), Nvidia will be entering a position where it will be competing directly with older GPUs used for mining that are as capable as the cards Nvidia is currently selling.
Second-hand GPUs from mining are known to function very well, often needing only a fan replacement. This is because semiconductors work best in a steady state, as opposed to being repeatedly turned on and off, so a card endures less wear when used 24/7. The market is also pessimistic regarding AMD’s P/E ratio. The market is accustomed to evaluating stocks using the P/E ratio, but this metric is not accurate for evaluating new companies, or companies going into or coming out of bankruptcy. It is more accurate for evaluating companies that have a consistent operating trend over time. “Similarly, a company with very low earnings now may command a very high P/E ratio even though it isn’t necessarily overvalued. The company may have just IPO’d and growth expectations are very high, or expectations remain high since the company dominates the technology in its space.” P/E Ratio: Problems With The P/E I regard the pessimism surrounding AMD stock due to GPUs and past history as a positive trait, because the threat is minor. While AMD is experiencing competitive problems with its GPUs in gaming, AMD holds an advantage in blockchain processing, which stands to be a larger and more lucrative market. I also believe that AMD’s progress with Zen, particularly with EPYC, and the recent Meltdown-related security and performance issues with all Intel CPU offerings, far outweigh any GPU turbulence. This turns the pessimism surrounding AMD’s GPUs into a stock benefit. 1) A pessimistic group prevents the stock from becoming a bubble. It provides a counterargument against hype relating to product launches that are not yet proven by earnings, which is unfortunately a historical trend for AMD, as it has had difficulty selling server and consumer CPUs in the past due to market interference by Intel. 2) It creates predictable daily, weekly, monthly, and quarterly fluctuations in the stock price that can be used to generate income. 
3) Due to recent product launches and market conditions (the Zen architecture advantage, the 12nm node launching, the Meltdown performance flaw affecting all Intel CPUs, Intel’s problems with 10nm) and the fact that AMD is once again selling a competitive product, AMD is making more money each quarter. Therefore the base price of AMD’s stock will rise with earnings, as we’re seeing. This is also a form of investment security, where perceived losses are returned over time, because the stock is in a long-term upward trajectory driven by new products reaching a responsive market. 4) AMD remains a cheap stock. While it’s volatile, it’s in a long-term upward trend due to market conditions and new product launches. An investor can buy more stock (with a limited budget) to maximize earnings. This advantage also means that the stock is more easily manipulated, as seen during the Q3 2017 ER. 5) The pessimism is unfounded. The cryptocurrency craze hasn’t died; it increased, fell, and recovered. The second-hand market did not see an influx of mining GPUs, as mining remains profitable. 6) Blockchain is an emerging market that may eclipse the gaming market in size due to the wide breadth of applications across various industries. Vega is a highly desired product for blockchain applications, as AMD has retained a processing and performance advantage over Nvidia there. Applications for blockchain are growing rapidly, and most of them will require GPUs: for instance Microsoft, the Golem supercomputer, IBM, HP, Oracle, Red Hat, and others.

Long-term upwards trend

AMD is at the beginning of a long-term upward trend supported by a comprehensive and competitive product portfolio that is still being delivered to the market; AMD refers to this as product ramping. AMD’s most effective Zen products are EPYC and the Raven Ridge APU. EPYC entered the market in mid-December and was completely sold out by mid-January, but has since been restocked. 
Intel remains uncompetitive in that industry, as its CPU offerings are hampered by a performance penalty of up to 40% due to Meltdown patches. Server CPU sales command the highest margins for both Intel and AMD. The AMD Raven Ridge APU was recently released to excellent reviews. The APU is significant due to high GPU prices driven by cryptocurrency, and the fact that the APU is a CPU/GPU hybrid with the performance to play today’s games at 1080p. The APU also supports the Vulkan API, which can call upon multiple GPUs to increase performance, so a system can be upgraded at a later date with an AMD or Nvidia GPU that supports the Vulkan API, for increased performance in games or workloads programmed to support it. Or the APU can be replaced when GPU prices fall. AMD also stands to benefit as Intel confirmed that its new 10nm fabrication node is behind in technical capability relative to the Samsung, TSMC, and Global Foundries 7nm processes. This brings into question Intel’s competitiveness in 2019 and beyond.

Take-Away

• AMD was uncompetitive with respect to CPUs from 2011 to 2017.
• When AMD was competitive, from 1996 to 2011, it made record profits and bought 3 companies, including ATI.
• AMD’s CPU business suffered from:
• Market manipulation by Intel (Intel was fined by the EU, Japan, and Korea, and settled with the USA).
• Foundry productivity and upgrade complications.
• AMD has changed:
• Global Foundries was spun off as an independent business.
• It has developed 14nm and 12nm, and is implementing 7nm fabrication.
• Intel is late on 10nm, which is less competitive than the 7nm node.
• AMD will fabricate products using multiple foundries (TSMC, Global Foundries).
• The market has changed:
• More AMD products are available on the Internet, and both the adoption of the Internet and the size of the Internet retail market have exploded, thanks to the success of smartphones and tablets.
• Consumer habits have changed; more people shop online each year. 
• Traditional retailers have lost market share.
• The computer market is larger on average, but has been declining. While computer shipments declined in Q2 and Q3 2017, AMD sold more CPUs.
• AMD was uncompetitive with respect to CPUs from 2011 to 2017.
• Analysts look to GPU and Semi-Custom sales for revenue.
• The cryptocurrency boom intensified; no crash occurred.
• AMD did not increase GPU production to meet cryptocurrency demand.
• Blockchain represents new growth potential for AMD GPUs.
• Pessimism acts as security against a stock bubble and corresponding bust.
• It creates cyclical volatility in the stock that can be used to generate profit.
• The P/E ratio is misleading when used to evaluate AMD.
• AMD has long-term growth potential.
• In 2017 AMD released a competitive product portfolio.
• Since Zen was released in March 2017, AMD has beaten ER expectations.
• AMD returned to profitability in 2017.
• AMD is taking measurable market share from Intel in OEM desktop CPUs and in the CPU market overall.
• The high-margin server product EPYC was released in December 2017, before the worst-ever CPU security bug was found in Intel CPUs, which were hit with a detrimental 40% performance patch.
• The Ryzen APU (Raven Ridge) was announced in February 2018 to meet the gaming GPU shortage created by high GPU demand for cryptocurrency mining.
• Blockchain is a long-term growth opportunity for AMD.
• Intel is behind the competition for the next CPU fabrication node.

AMD’s growing CPU advantage over Intel

About AMD’s Zen

Zen is a technical breakthrough in CPU architecture because it’s a modular design and because it is a small CPU that provides similar or better performance than the Intel competition. Since Zen was released in March 2017, we’ve seen AMD go from 18% CPU market share in OEM consumer desktops to essentially 50% market share; this was supported by comments from Lisa Su during the Q3 2017 ER call, by MindFactory.de, and by Amazon sales of CPUs. We also saw AMD increase its market share of total desktop CPUs. 
We also started seeing market share flux between AMD and Intel as new CPUs are released. Zen is a technical breakthrough supported by a few general guidelines relating to electronics. This provides AMD with an across-the-board advantage over Intel in every CPU market addressed. 1) The larger the CPU, the lower the yield. The Zen die that makes up Ryzen, Threadripper, and EPYC is smaller (44 mm² versus 151 mm² for Coffee Lake). A larger CPU means fewer CPUs fabricated per wafer. AMD will get roughly 3x the fabrication yield for each Zen printed compared to each Coffee Lake printed, so each CPU has a much lower manufacturing cost. 2) The larger the CPU, the harder it is to fabricate without errors. The chance that a CPU will be perfectly fabricated falls exponentially with increasing surface area. Intel will have fewer high-quality CPUs printed compared to AMD, which means AMD makes a higher margin on each CPU sold. AMD’s supply of perfectly printed Ryzens (1800X) was so high that the company had to sell them at a reduced cost in order to meet demand for the cheaper Ryzen 5 1600X. If you bought a 1600X in August/September, you probably ended up with an 1800X. 3) Larger CPUs are harder to fabricate without errors on smaller nodes. The technical capability to fabricate CPUs at smaller nodes becomes more difficult due to the higher precision required, and due to the corresponding increase in errors. “A second reason for the slowdown is that it’s simply getting harder to design, inspect and test chips at advanced nodes. Physical effects such as heat, electrostatic discharge and electromagnetic interference are more pronounced at 7nm than at 28nm. It also takes more power to drive signals through skinny wires, and circuits are more sensitive to test and inspection, as well as to thermal migration across a chip. 
All of that needs to be accounted for and simulated using multi-physics simulation, emulation and prototyping.“ Is 7nm The Last Major Node? “Simply put, the first generation of 10nm requires small processors to ensure high yields. Intel seems to be putting the smaller die sizes (i.e. anything under 15W for a laptop) into the 10nm Cannon Lake bucket, while the larger 35W+ chips will be on 14++ Coffee Lake, a tried and tested sub-node for larger CPUs. While the desktop sits on 14++ for a bit longer, it gives time for Intel to further develop their 10nm fabrication abilities, leading to their 10+ process for larger chips by working their other large chip segments (FPGA, MIC) first.” There are plenty of steps where errors can be introduced into a fabricated CPU. This is most likely the culprit behind Intel’s inability to launch its 10nm fabrication process: Intel is simply unable to print such a large CPU on such a small node with high enough yields to make the process competitive. Intel thought it was ahead of the competition with respect to printing large CPUs on a small node, until AMD avoided the issue entirely by designing a smaller, modular CPU. Intel avoided any mention of its 10nm node during its Q4 2017 ER, which I interpret as bad news for Intel shareholders. If you have nothing good to say, you say nothing; Intel having nothing to say about something fundamentally critical to its success as a company can’t be good. Intel is on track, however, to deliver hybrid CPUs where some small components are printed on 10nm. It has also recently come to light that Intel’s 10nm node is less competitive than the Global Foundries, Samsung, and TSMC 7nm nodes, which means Intel is now firmly behind in CPU fabrication. 4) AMD Zen is a new architecture built from the ground up. Intel’s CPUs are built on top of older architecture developed with 30-year-old strategies, some of which we’ve recently discovered are flawed. 
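The yield argument above (a smaller die gives more candidates per wafer, and the chance of a defect-free die falls exponentially with area) can be sketched with the textbook Poisson defect model. This is a first-order illustration, not AMD's or Intel's actual yield math: the defect density and 300mm wafer are assumed values, while the 44 mm² and 151 mm² die areas are the figures quoted above.

```python
import math

# First-order yield sketch using the classic Poisson defect model:
#   yield ≈ exp(-defect_density * die_area)
# Defect density is an assumed illustrative value.

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Probability that a die of the given area has zero defects."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Crude gross-die estimate, ignoring edge loss and scribe lines."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2)

d0 = 0.002  # assumed defects per mm², for illustration only
for name, area in [("small die (44 mm2)", 44.0), ("large die (151 mm2)", 151.0)]:
    good = dies_per_wafer(area) * poisson_yield(area, d0)
    print(f"{name}: ~{dies_per_wafer(area)} gross dies, "
          f"yield {poisson_yield(area, d0):.0%}, ~{good:.0f} good dies")
```

With these assumed numbers the small die produces several times more good dies per wafer, which is the direction of the "roughly 3x" claim above; the exact ratio depends on the process's real defect density, which is not public.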
This resulted in the Meltdown flaw, the Spectre flaws, and also the ME and AMT bugs in Intel CPUs. While AMD is still affected by Spectre, AMD has only acknowledged being fully susceptible to Spectre variant 1, as AMD considers variant 2 difficult to exploit on an AMD Zen CPU. “It is much more difficult on all AMD CPUs, because BTB entries are not aliased - the attacker must know (and be able to execute arbitrary code at) the exact address of the targeted branch instruction.” Technical Analysis of Spectre & Meltdown * Amd

Further reading

Spectre and Meltdown: Linux creator Linus Torvalds criticises Intel's 'garbage' patches | ZDNet
FYI: Processor bugs are everywhere - just ask Intel and AMD
Meltdown and Spectre: Good news for AMD users, (more) bad news for Intel
Cybersecurity agency: The only sure defense against huge chip flaw is a new chip
Kernel-memory-leaking Intel processor design flaw forces Linux, Windows redesign

Take-Away

• AMD Zen enjoys a CPU fabrication yield advantage over Intel.
• AMD Zen enjoys a higher yield of high-quality CPUs.
• Intel’s CPUs suffer a performance drop of up to 40% due to Meltdown patches, which affects server CPU sales.

AMD stock drivers

1) EPYC
• A critically acclaimed CPU that is sold at a discount compared to Intel.
• Not affected by 40% software slow-downs due to Meltdown.
2) Raven Ridge desktop APU
• Targets the underserved GPU market, which has been stifled by cryptocurrency demand.
• Customers can upgrade to a new CPU or add a GPU at a later date without changing the motherboard.
• AM4 motherboard supported until 2020.
3) Vega GPU sales to Intel for 8th-generation CPUs with integrated graphics
• AMD gains access to the complete desktop and mobile market through Intel.
4) Mobile Ryzen APU sales
• Providing gaming capability in a compact power envelope.
5) Ryzen and Threadripper sales
• Fabricated on 12nm in April.
• May eliminate Intel’s last remaining CPU advantage in single-core IPC. 
• AM4 motherboard supported until 2020.
• 7nm Ryzen on track for early 2019.
6) Others: Vega, Polaris, Semi-Custom, etc.
• I consider any positive developments here to be gravy.

Conclusion

While in the past Intel interfered with AMD's ability to bring its products to market, the market has changed. The Internet has grown significantly and is now a large market that dominates computer sales. It's questionable whether Intel still has the influence to affect this new market, and doing so would most certainly result in fines and further bad press. AMD's foundry problems were turned into an advantage over Intel. AMD's more recent past was heavily influenced by the failure of the Bulldozer line of CPUs, which dragged on AMD's bottom line from 2011 to 2017. AMD's Zen line of CPUs is a breakthrough that exploits an alternative, superior strategy in chip design, resulting in a smaller CPU. A smaller CPU enjoys compounded yield and quality advantages over Intel's CPU architecture. Intel's lead in CPU performance will at the very least be challenged, and will more likely come to an end in 2018, until Intel releases a redesigned CPU. I previously targeted AMD to be worth $20 by the Q4 2017 ER; this was based on the speed with which Intel is able to get products to market, and AMD is much slower by comparison. I believe the stock should be there already, but the GPU-related story was prominent due to the cryptocurrency craze. Financial analysts need more time to catch on to what’s happening with AMD; they need an ER that is driven by CPU sales, and I believe Q1 2018 is the ER to do that. AMD had EPYC stock in stores when the Meltdown and Spectre flaws hit the news. Those CPUs were sold out by mid-January and are high-margin sales. There are many variables at play within the market; however, barring any disruptions, I’d expect AMD to be worth $20 at some point in 2018 due to these market drivers. 
If AMD sells enough EPYC CPUs due to Intel’s ongoing CPU security problems, then it may occur following the Q1 2018 ER. However, if anything is customary with AMD, it’s that these things always take longer than expected.
FPGA. The next step in Bitcoin mining development was the advent of FPGA (Field-Programmable Gate Array) mining. ... Some professionals consider ASICs to be ‘end-of-the-line’ technology, as there is nothing to replace them in the immediate future. Estimates are $1 per Bitcoin for electricity (GPU mining). The current BTC price is around $6-7 each. It is all about the ROI window. If BTC drops below $1, then FPGA mining would rule the day. But do you see any reason for Bitcoin to fall in value any time soon? FPGA stands for Field-Programmable Gate Array. These devices were very popular among users who did not want to keep mining in the competitive landscape of GPU mining. The devices are designed so that users can configure their integrated circuits after the manufacturing process is completed. Almost all Bitcoin ASIC chips can only be used for Bitcoin mining and are tailored to the Bitcoin mining algorithm (SHA-256). Other, non-Bitcoin ASIC chips are used for mining different cryptocurrencies based on their various mining algorithms, such as Ethereum (Ethash), Litecoin (Scrypt), and Dash (X11). In this work we discuss the process of Bitcoin mining and present a new Bitcoin miner implemented on an FPGA. We start by providing a brief introduction to the Bitcoin network in Section 2. Next, we present some details of the Bitcoin mining process in Section 3. Then we outline our FPGA implementation of a Bitcoin miner in Section 4.
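Whether it runs on a GPU, an FPGA, or an ASIC, the work being accelerated is the same SHA-256 proof-of-work search. A minimal Python sketch of that loop follows; the header bytes are dummy values and the difficulty is a toy setting, since a real miner hashes the 80-byte Bitcoin block header against a vastly harder target.

```python
import hashlib
import struct

# Minimal sketch of the SHA-256 proof-of-work loop that GPU/FPGA/ASIC
# miners accelerate. Header contents here are dummy illustration values.

def double_sha256(data: bytes) -> bytes:
    """Bitcoin hashes everything with SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_prefix: bytes, difficulty_bits: int, max_nonce: int = 2**32):
    """Search for a nonce whose double-SHA-256 hash falls below the target."""
    target = 1 << (256 - difficulty_bits)
    for nonce in range(max_nonce):
        header = header_prefix + struct.pack("<I", nonce)  # 32-bit nonce
        digest = double_sha256(header)
        # The digest is compared as a little-endian 256-bit integer.
        if int.from_bytes(digest, "little") < target:
            return nonce, digest.hex()
    return None  # nonce space exhausted; a real miner would change the header

# Toy difficulty (16 leading zero bits) so this finishes quickly on a CPU.
print(mine(b"example block header", difficulty_bits=16))
```

An FPGA or ASIC wins by evaluating this same double-SHA-256 pipeline billions of times per second in parallel hardware, which is why general-purpose CPUs stopped being competitive.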
Using FPGA to mine on Auto Exchange Pools for Bitcoin
This short video by Whitefire990 demonstrates an FPGA mining rig consisting of 8 Xilinx VCU1525 FPGA cards. The cards are running freely available software a... Unboxing of a Cairnsmore1 Quad XC6SLX150 board developed and built by enterpoint/UK. Features FPGAs Quad version - 4 x Spartan(TM)-6 XC6SLX150 FPGAs wired li... I decided to mine the Phi2 algo on a pool that sells the coins for Bitcoin; using pools like this usually results in a faster ROI, as you are mining the most profitable coin with others and selling your ... I picked up a few PCI FPGA cards on eBay for 99p which, apparently, can mine Bitcoin at a speed of 21 Ghash/s (once they're correctly configured!) Do you guys think FPGAs will one day take over GPU mining? If certain coins do not change algorithms, we will see the dominance of FPGA. Today we take a look at a few coins that potentially have ...
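The "ROI window" reasoning above reduces to simple arithmetic: payback time is hardware cost divided by daily profit after power. A sketch using the rough figures quoted earlier ($1 of electricity per coin mined, BTC at around $6-7); the hardware cost and coins-per-day rate are made-up illustrative values, and real profitability also depends on network difficulty, which changes constantly.

```python
# Back-of-the-envelope ROI sketch for the mining economics discussed above.
# All inputs are rough figures from the text or assumed illustrative values.

def days_to_roi(hardware_cost: float,
                coins_per_day: float,
                coin_price: float,
                electricity_cost_per_coin: float) -> float:
    """Days until mining revenue, net of power, pays back the hardware."""
    daily_profit = coins_per_day * (coin_price - electricity_cost_per_coin)
    if daily_profit <= 0:
        return float("inf")  # power costs more than the coin is worth
    return hardware_cost / daily_profit

# Text's figures: ~$1 electricity per coin (GPU mining), BTC at ~$6.50.
# Hardware cost and coin yield below are assumptions, not sourced numbers.
print(days_to_roi(hardware_cost=300.0, coins_per_day=0.5,
                  coin_price=6.5, electricity_cost_per_coin=1.0))
```

This is also why the text frames the FPGA-versus-GPU question around the coin price: FPGAs draw far less power per hash, so if the coin price falls toward the electricity cost per coin, the GPU's daily profit goes negative first and only the more efficient hardware keeps a finite payback period.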