What is the True Value of Bitcoin?

Covid has little to do with a bad virus, and everything to do with restructuring the financial system

The IMF is running its annual meetings in Andorra at the moment.
The director of the IMF said on Thursday last week:
> Today we face a new Bretton Woods “moment.”
Now, what were the Bretton Woods agreements about? They set up a new system under which gold was the basis for the U.S. dollar and other currencies were pegged to the U.S. dollar's value. The Bretton Woods Agreement also created two important organizations—the International Monetary Fund (IMF) and the World Bank.
What could a new Bretton Woods moment mean in this context? It means they are restructuring the current monetary system. Under the new system, the USD is replaced by a digital currency.
A central bank-supported digital currency could replace the dollar as the global reserve currency, said Bank of England governor Mark Carney.
Carney highlighted the dollar’s use in international securities issuance, its use as the primary settlement currency for international trades and the fact that companies use dollars as examples of its dominance. However, “developments in the U.S. economy, by affecting the dollar exchange rate, can have large spillover effects to the rest of the world.”
Fed Chair Jerome Powell noted he did not believe private sector involvement in the production of U.S. dollars would be trusted by the citizens. “I do think this is something that the central banks have to design,” Powell said. “The private sector is not involved in creating the money supply, that’s something the central bank does.”
As if it was not obvious, central banks don't want a bitcoin/dogecoin/monero/pokemoncoin, etc... currency. They want to fully control the new digital currency, like they control current fiat currencies.
Back to the IMF director's speech: she states three imperatives moving forward. The first two are about economic policies, and the third is about climate change.
> Just as the pandemic has shown that we can no longer ignore health precautions, we can no longer afford to ignore climate change—my third imperative.
That third imperative is surprising. What does climate change have to do with the IMF and the definition of a new monetary system?
Here is a very interesting article about how this all relates to Bill Gates' mass vaccination agenda.
According to an article published by ID2020 in 2018, vaccines are a perfect way to introduce digital identity to the world – especially for infants. This identity would also be used to grant access to basic rights and services.
Your new digital ID will then be matched with your new digital currency issued by your central bank. They will have the absolute, uncontested right to decide whether you can have access to basic rights and services or not. It will only take a click of the mouse to deny your access, and you won't know the reason. It could be for wrong thinking, or to pursue another political agenda against whichever community they have decided to eliminate. We have seen plenty of evidence this year of the strong political bias of the big social media platforms. With the constant monitoring and analyzing of our data, they can easily tell what our political opinions are, and therefore deny your access to basic rights and services with a click if you hold the 'wrong' ones. And I don't see why they would not do that. In the very near future, you could end up having to choose between being allowed to eat and voting for the candidate you don't like but that the system endorses. It is literally the end of democracy and freedom, and there is no going back once we have switched to this new system.
All the above is not even a conspiracy. It's merely about connecting the dots, and understanding the implications.
Edit: here is a video of Accenture, one of the founding partners of ID2020, explaining the digital dollar.
I think covid was a catalyst to bring about all these changes. Who else but the international financial system has the ability to make every country on the planet comply with restrictions so severe that they send their respective economies and societies down the toilet?
submitted by TechnicalBody to conspiracy

Proposal: The Sia Foundation

Vision Statement

A common sentiment is brewing online; a shared desire for the internet that might have been. After decades of corporate encroachment, you don't need to be a power user to realize that something has gone very wrong.
In the early days of the internet, the future was bright. In that future, when you sent an instant message, it traveled directly to the recipient. When you needed to pay a friend, you announced a transfer of value to their public key. When an app was missing a feature you wanted, you opened up the source code and implemented it. When you took a picture on your phone, it was immediately encrypted and backed up to storage that you controlled. In that future, people would laugh at the idea of having to authenticate themselves to some corporation before doing these things.
What did we get instead? Rather than a network of human-sized communities, we have a handful of enormous commons, each controlled by a faceless corporate entity. Hey user, want to send a message? You can, but we'll store a copy of it indefinitely, unencrypted, for our preference-learning algorithms to pore over; how else could we slap targeted ads on every piece of content you see? Want to pay a friend? You can—in our Monopoly money. Want a new feature? Submit a request to our Support Center and we'll totally maybe think about it. Want to backup a photo? You can—inside our walled garden, which only we (and the NSA, of course) can access. Just be careful what you share, because merely locking you out of your account and deleting all your data is far from the worst thing we could do.
You rationalize this: "MEGACORP would never do such a thing; it would be bad for business." But we all know, at some level, that this state of affairs, this inversion of power, is not merely "unfortunate" or "suboptimal" – No. It is degrading. Even if MEGACORP were purely benevolent, it is degrading that we must ask its permission to talk to our friends; that we must rely on it to safeguard our treasured memories; that our digital lives are completely beholden to those who seek only to extract value from us.
At the root of this issue is the centralization of data. MEGACORP can surveil you—because your emails and video chats flow through their servers. And MEGACORP can control you—because they hold your data hostage. But centralization is a solution to a technical problem: How can we make the user's data accessible from anywhere in the world, on any device? For a long time, no alternative solution to this problem was forthcoming.
Today, thanks to a confluence of established techniques and recent innovations, we have solved the accessibility problem without resorting to centralization. Hashing, encryption, and erasure coding got us most of the way, but one barrier remained: incentives. How do you incentivize an anonymous stranger to store your data? Earlier protocols like BitTorrent worked around this limitation by relying on altruism, tit-for-tat requirements, or "points" – in other words, nothing you could pay your electric bill with. Finally, in 2009, a solution appeared: Bitcoin. Not long after, Sia was born.
Cryptography has unleashed the latent power of the internet by enabling interactions between mutually-distrustful parties. Sia harnesses this power to turn the cloud storage market into a proper marketplace, where buyers and sellers can transact directly, with no intermediaries, anywhere in the world. No more silos or walled gardens: your data is encrypted, so it can't be spied on, and it's stored on many servers, so no single entity can hold it hostage. Thanks to projects like Sia, the internet is being re-decentralized.
Sia began its life as a startup, which means it has always been subjected to two competing forces: the ideals of its founders, and the profit motive inherent to all businesses. Its founders have taken great pains to never compromise on the former, but this often threatened the company's financial viability. With the establishment of the Sia Foundation, this tension is resolved. The Foundation, freed of the obligation to generate profit, is a pure embodiment of the ideals from which Sia originally sprung.
The goals and responsibilities of the Foundation are numerous: to maintain core Sia protocols and consensus code; to support developers building on top of Sia and its protocols; to promote Sia and facilitate partnerships in other spheres and communities; to ensure that users can easily acquire and safely store siacoins; to develop network scalability solutions; to implement hardforks and lead the community through them; and much more. In a broader sense, its mission is to commoditize data storage, making it cheap, ubiquitous, and accessible to all, without compromising privacy or performance.
Sia is a perfect example of how we can achieve better living through cryptography. We now begin a new chapter in Sia's history. May our stewardship lead it into a bright future.
 

Overview

Today, we are proposing the creation of the Sia Foundation: a new non-profit entity that builds and supports distributed cloud storage infrastructure, with a specific focus on the Sia storage platform. What follows is an informal overview of the Sia Foundation, covering two major topics: how the Foundation will be funded, and what its funds will be used for.

Organizational Structure

The Sia Foundation will be structured as a non-profit entity incorporated in the United States, likely a 501(c)(3) organization or similar. The actions of the Foundation will be constrained by its charter, which formalizes the specific obligations and overall mission outlined in this document. The charter will be updated on an annual basis to reflect the current goals of the Sia community.
The organization will be operated by a board of directors, initially comprising Luke Champine as President and Eddie Wang as Chairman. Luke Champine will be leaving his position at Nebulous to work at the Foundation full-time, and will seek to divest his shares of Nebulous stock along with other potential conflicts of interest. Neither Luke nor Eddie personally own any siafunds or significant quantities of siacoin.

Funding

The primary source of funding for the Foundation will come from a new block subsidy. Following a hardfork, 30 KS per block will be allocated to the "Foundation Fund," continuing in perpetuity. The existing 30 KS per block miner reward is not affected. Additionally, one year's worth of block subsidies (approximately 1.57 GS) will be allocated to the Fund immediately upon activation of the hardfork.
As detailed below, the Foundation will provably burn any coins that it cannot meaningfully spend. As such, the 30 KS subsidy should be viewed as a maximum. This allows the Foundation to grow alongside Sia without requiring additional hardforks.
The Foundation will not be funded to any degree by the possession or sale of siafunds. Siafunds were originally introduced as a means of incentivizing growth, and we still believe in their effectiveness: a siafund holder wants to increase the amount of storage on Sia as much as possible. While the Foundation obviously wants Sia to succeed, its driving force should be its charter. Deriving significant revenue from siafunds would jeopardize the Foundation's impartiality and focus. Ultimately, we want the Foundation to act in the best interests of Sia, not in growing its own budget.

Responsibilities

The Foundation inherits a great number of responsibilities from Nebulous. Each quarter, the Foundation will publish the progress it has made over the past quarter, and list the responsibilities it intends to prioritize over the coming quarter. This will be accompanied by a financial report, detailing each area of expenditure over the past quarter, and forecasting expenditures for the coming quarter. Below, we summarize some of the myriad responsibilities towards which the Foundation is expected to allocate its resources.

Maintain and enhance core Sia software

Arguably, this is the most important responsibility of the Foundation. At the heart of Sia is its consensus algorithm: regardless of other differences, all Sia software must agree upon the content and rules of the blockchain. It is therefore crucial that the algorithm be stewarded by an entity that is accountable to the community, transparent in its decision-making, and has no profit motive or other conflicts of interest.
Accordingly, Sia’s consensus functionality will no longer be directly maintained by Nebulous. Instead, the Foundation will release and maintain an implementation of a "minimal Sia full node," comprising the Sia consensus algorithm and P2P networking code. The source code will be available in a public repository, and signed binaries will be published for each release.
Other parties may use this code to provide alternative full node software. For example, Nebulous may extend the minimal full node with wallet, renter, and host functionality. The source code of any such implementation may be submitted to the Foundation for review. If the code passes review, the Foundation will provide "endorsement signatures" for the commit hash used and for binaries compiled internally by the Foundation. Specifically, these signatures assert that the Foundation believes the software contains no consensus-breaking changes or other modifications to imported Foundation code. Endorsement signatures and Foundation-compiled binaries may be displayed and distributed by the receiving party, along with an appropriate disclaimer.
A minimal full node is not terribly useful on its own; the wallet, renter, host, and other extensions are what make Sia a proper developer platform. Currently, the only implementations of these extensions are maintained by Nebulous. The Foundation will contract Nebulous to ensure that these extensions continue to receive updates and enhancements. Later on, the Foundation intends to develop its own implementations of these extensions and others. As with the minimal node software, these extensions will be open source and available in public repositories for use by any Sia node software.
With the consensus code now managed by the Foundation, the task of implementing and orchestrating hardforks becomes its responsibility as well. When the Foundation determines that a hardfork is necessary (whether through internal discussion or via community petition), a formal proposal will be drafted and submitted for public review, during which arguments for and against the proposal may be submitted to a public repository. During this time, the hardfork code will be implemented, either by Foundation employees or by external contributors working closely with the Foundation. Once the implementation is finished, final arguments will be heard. The Foundation board will then vote whether to accept or reject the proposal, and announce their decision along with appropriate justification. Assuming the proposal was accepted, the Foundation will announce the block height at which the hardfork will activate, and will subsequently release source code and signed binaries that incorporate the hardfork code.
Regardless of the Foundation's decision, it is the community that ultimately determines whether a fork is accepted or rejected – nothing can change that. Foundation node software will never automatically update, so all forks must be explicitly adopted by users. Furthermore, the Foundation will provide replay and wipeout protection for its hard forks, protecting other chains from unintended or malicious reorgs. Similarly, the Foundation will ensure that any file contracts formed prior to a fork activation will continue to be honored on both chains until they expire.
Finally, the Foundation also intends to pursue scalability solutions for the Sia blockchain. In particular, work has already begun on an implementation of Utreexo, which will greatly reduce the space requirements of fully-validating nodes (allowing a full node to be run on a smartphone) while increasing throughput and decreasing initial sync time. A hardfork implementing Utreexo will be submitted to the community as per the process detailed above.
As this is the most important responsibility of the Foundation, it will receive a significant portion of the Foundation’s budget, primarily in the form of developer salaries and contracting agreements.

Support community services

We intend to allocate 25% of the Foundation Fund towards the community. This allocation will be held and disbursed in the form of siacoins, and will pay for grants, bounties, hackathons, and other community-driven endeavours.
Any community-run service, such as a Skynet portal, explorer or web wallet, may apply to have its costs covered by the Foundation. Upon approval, the Foundation will reimburse expenses incurred by the service, subject to the exact terms agreed to. The intent of these grants is not to provide a source of income, but rather to make such services "break even" for their operators, so that members of the community can enrich the Sia ecosystem without worrying about the impact on their own finances.

Ensure easy acquisition and storage of siacoins

Most users will acquire their siacoins via an exchange. The Foundation will provide support to Sia-compatible exchanges, and pursue relevant integrations at its discretion, such as Coinbase's new Rosetta standard. The Foundation may also release DEX software that enables trading cryptocurrencies without the need for a third party. (The Foundation itself will never operate as a money transmitter.)
Increasingly, users are storing their cryptocurrency on hardware wallets. The Foundation will maintain the existing Ledger Nano S integration, and pursue further integrations at its discretion.
Of course, all hardware wallets must be paired with software running on a computer or smartphone, so the Foundation will also develop and/or maintain client-side wallet software, including both full-node wallets and "lite" wallets. Community-operated wallet services, i.e. web wallets, may be funded via grants.
Like core software maintenance, this responsibility will be funded in the form of developer salaries and contracting agreements.

Protect the ecosystem

When it comes to cryptocurrency security, patching software vulnerabilities is table stakes; there are significant legal and social threats that we must be mindful of as well. As such, the Foundation will earmark a portion of its fund to defend the community from legal action. The Foundation will also safeguard the network from 51% attacks and other threats to network security by implementing softforks and/or hardforks where necessary.
The Foundation also intends to assist in the development of a new FOSS software license, and to solicit legal memos on various Sia-related matters, such as hosting in the United States and the EU.
In a broader sense, the establishment of the Foundation makes the ecosystem more robust by transferring core development to a more neutral entity. Thanks to its funding structure, the Foundation will be immune to various forms of pressure that for-profit companies are susceptible to.

Drive adoption of Sia

Although the overriding goal of the Foundation is to make Sia the best platform it can be, all that work will be in vain if no one uses the platform. There are a number of ways the Foundation can promote Sia and get it into the hands of potential users and developers.
In-person conferences are understandably far less popular now, but the Foundation can sponsor and/or participate in virtual conferences. (In-person conferences may be held in the future, circumstances permitting.) Similarly, the Foundation will provide prizes for hackathons, which may be organized by community members, Nebulous, or the Foundation itself. Lastly, partnerships with other companies in the cryptocurrency space—or the cloud storage space—are a great way to increase awareness of Sia. To handle these responsibilities, one of the early priorities of the Foundation will be to hire a marketing director.

Fund Management

The Foundation Fund will be controlled by a multisig address. Each member of the Foundation's board will control one of the signing keys, with the signature threshold to be determined once the final composition of the board is known. (This threshold may also be increased or decreased if the number of board members changes.) Additionally, one timelocked signing key will be controlled by David Vorick. This key will act as a “dead man’s switch,” to be used in the event of an emergency that prevents Foundation board members from reaching the signature threshold. The timelock ensures that this key cannot be used unless the Foundation fails to sign a transaction for several months.
On the 1st of each month, the Foundation will use its keys to transfer all siacoins in the Fund to two new addresses. The first address will be controlled by a high-security hot wallet, and will receive approximately one month's worth of Foundation expenditures. The second address, receiving the remaining siacoins, will be a modified version of the source address: specifically, it will increase the timelock on David Vorick's signing key by one month. Any other changes to the set of signing keys, such as the arrival or departure of board members, will be incorporated into this address as well.
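To make the rollover concrete, here is a minimal sketch of the monthly sweep described above. The field and function names are hypothetical (this is not siad's API), and it only models the bookkeeping, not the actual transaction construction or signing:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class FundAddress:
    board_keys: tuple            # one signing key per board member
    threshold: int               # board signatures required under normal operation
    deadman_key: str             # the timelocked emergency key
    deadman_unlock_month: int    # month index at which the emergency key becomes usable

def monthly_rollover(source: FundAddress, balance_sc: float, monthly_budget_sc: float):
    """Split the Fund: ~one month of expenses to the hot wallet, the remainder to a
    new cold address whose dead-man's-switch timelock is pushed out by one month."""
    hot_amount = min(monthly_budget_sc, balance_sc)
    cold_amount = balance_sc - hot_amount
    new_cold = replace(source, deadman_unlock_month=source.deadman_unlock_month + 1)
    # Changes to the board (arrivals/departures) would also be folded into new_cold here.
    return hot_amount, new_cold, cold_amount
```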
The Foundation Fund is allocated in SC, but many of the Foundation's expenditures must be paid in USD or other fiat currency. Accordingly, the Foundation will convert, at its discretion, a portion of its monthly withdrawals to fiat currency. We expect this conversion to be primarily facilitated by private "OTC" sales to accredited investors. The Foundation currently has no plans to speculate in cryptocurrency or other assets.
Finally, it is important that the Foundation adds value to the Sia platform well in excess of the inflation introduced by the block subsidy. For this reason, the Foundation intends to provably burn, on a quarterly basis, any coins that it cannot allocate towards any justifiable expense. In other words, coins will be burned whenever doing so provides greater value to the platform than any other use. Furthermore, the Foundation will cap its SC treasury at 5% of the total supply, and will cap its USD treasury at 4 years’ worth of predicted expenses.
 
Addendum: Hardfork Timeline
We would like to see this proposal finalized and accepted by the community no later than September 30th. A new version of siad, implementing the hardfork, will be released no later than October 15th. The hardfork will activate at block 293220, which is expected to occur around 12pm EST on January 1st, 2021.
 
Addendum: Inflation specifics
The total supply of siacoins as of January 1st, 2021 will be approximately 45.243 GS. The initial subsidy of 1.57 GS thus increases the supply by 3.47%, and the total annual inflation in 2021 will be at most 10.4% (if zero coins are burned). In 2022, total annual inflation will be at most 6.28%, and will steadily decrease in subsequent years.
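As a sanity check, the figures above can be roughly reproduced with quick arithmetic, assuming Sia's approximately 10-minute block target (about 52,560 blocks per year) and that no coins are burned:

```python
blocks_per_year = 6 * 24 * 365                     # ~52,560 blocks at ~10 minutes each
foundation_per_year = 30_000 * blocks_per_year     # ~1.577e9 SC -- the "1.57 GS" figure
miner_per_year = 30_000 * blocks_per_year          # existing miner subsidy, unchanged

supply_2021 = 45.243e9                             # approximate supply on January 1st, 2021
initial_grant = foundation_per_year                # one year's subsidy, paid out at the hardfork

print(initial_grant / supply_2021)                 # ~0.035 -> the ~3.47% initial bump
issued_2021 = initial_grant + foundation_per_year + miner_per_year
print(issued_2021 / supply_2021)                   # ~0.104 -> at most ~10.4% inflation in 2021
supply_2022 = supply_2021 + issued_2021
print((foundation_per_year + miner_per_year) / supply_2022)   # ~0.063 -> ~6.3% in 2022
```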
 

Conclusion

We see the establishment of the Foundation as an important step in the maturation of the Sia project. It provides the ecosystem with a sustainable source of funding that can be exclusively directed towards achieving Sia's ambitious goals. Compared to other projects with far deeper pockets, Sia has always punched above its weight; once we're on equal footing, there's no telling what we'll be able to achieve.
Nevertheless, we do not propose this change lightly, and have taken pains to ensure that the Foundation will act in accordance with the ideals that this community shares. It will operate transparently, keep inflation to a minimum, and respect the user's fundamental role in decentralized systems. We hope that everyone in the community will consider this proposal carefully, and look forward to a productive discussion.
submitted by lukechampine to siacoin

[OWL WATCH] Waiting for "IOTA TIME" 30;

Disclaimer: This is sort of my own arbitrary editing, so there could be some misunderstandings.
I root for the spread of good spirits and transparency of IF.
Hans Moog [IF] (yesterday at 2:45 PM)
So why don't we just copy Avalanche? Well that's pretty simple ...
Hans Moog [IF] (yesterday at 2:47 PM)
1. It doesn't scale very well with the number of nodes in the network that have no say in the consensus process but are merely consensus-consuming nodes (i.e. sensors, edge devices and so on). If you assume that the network will never have more than a few thousand nodes then that's fine, but if you want to build a DLT that can cope with millions of devices then it won't work because of the message complexity.
2. If somebody starts spamming conflicts, then the whole network will stop confirming transactions and will grind to a halt until the conflict spamming stops. Avalanche thinks that this is not a huge problem because an attacker would have to spend fees for spamming conflicts, which means that he couldn't do this forever and would at some point run out of funds.
IOTA tries to build a feeless protocol, and a consensus that stops functioning if somebody spams conflicts is really not an option for us.
3. If a medium-sized validator goes offline for whatever reason, then the whole network will again stop confirming transactions, because whenever a query for a node's opinion cannot be answered, the counter for consecutive successful voting rounds is reset, which prevents confirmations. Since nodes need to open some ports to be available for queries, it is super easy to DDoS validators and again bring network confirmations to 0.
Hans Moog [IF] (yesterday at 3:05 PM)
4. Avalanche still processes transactions in "chunks/blocks" by only applying them after they have gone through some consensus process (gathered enough successful voting rounds), which means that nodes waste a significant amount of time "waiting" for the next chunk to be finished before the transactions are applied to the ledger state. IOTA tries to streamline this process by decoupling consensus and the booking of transactions using the "parallel reality based ledger state", which means that nodes in IOTA will never waste any time "waiting" for decisions to be made. This will give us much higher throughput numbers.
Hans Moog [IF] (yesterday at 3:11 PM)
5. Avalanche has some really severe game-theoretic problems where nodes are incentivized to attach their transactions to the already decided parts of the DAG, because then things like conflict spam won't affect those transactions as badly as the transactions issued by honest nodes. If, however, every node followed this "better and selfish" tip selection mechanism, then the network would stop working at all.
Overall, "being able to stop consensus" might not be too bad, since you can't really do anything really bad (i.e. double spend), which is why we might not see these kinds of attacks in the immediate future. But just wait until a few DeFi apps are running on their platform, where smart contracts are actually relying on more or less real-time execution. Then there might be some actual financial gains to be made if the contract halts, and we might see a lot of these things appear (including selfish tip selection).
Avalanche is barely a top-100 project, and nobody attacks these kinds of low-value networks unless there is something to be gained from such an attack. Saying that the fact that it's live on mainnet and hasn't been attacked in 3 weeks is proof of its security is completely wrong.
Especially considering that 95% of all stake is controlled by Avalanche itself
If you control > 50% of the voting power then you essentially control the whole network and attacks can mostly be ignored
I guess there is a reason for avalanche only selling 10% of the token supply to the public because then some of the named problems are less likely to appear
Navin Ramachandran [IF] (yesterday at 3:21 PM)
I have to say that wtf's suggestion is pretty condescending to all our researchers. It seems heavy on the troll aspect to suggest that we should ditch all our work because IOTA is only good at industrial adoption. Does wtf actually expect a response to this? Or is this grandstanding?
Hans Moog [IF] (yesterday at 3:22 PM)
The whole argument of "why don't you just use X instead of trying to build a better version" is also a completely idiotic argument. Why did ETH write their own protocol if Bitcoin was already around? Well, because they saw problems in Bitcoin's approach and tried to improve it.
Hans Moog [IF] (yesterday at 3:27 PM)
u/Navin Ramachandran [IF] It's like most of his arguments ... remember when he said we should implement colored coins in 2nd-layer smart contracts instead of the base layer because they would be more expressive (i.e. Turing complete), completely disregarding that 2nd-layer smart contracts only really work if you have consensus on data and therefore state, for which you need the "traceability" of funds to create these kinds of mini-blockchains in the Tangle?
Colored coins "enable" smart contracts and it wouldn't work the other way round - unless you have a platform that works exactly like ETH, where all the nodes validate a single shared execution platform for the smart contracts, which is not really scalable and is exactly what we are trying to solve with our approach.
Navin Ramachandran [IF] (yesterday at 3:28 PM)
Always easier to criticise than to build something yourself. And yet he keeps posting these inflammatory posts.
At this point, is there any doubt whether he is making these comments constructively?
Hans Moog [IF] (yesterday at 3:43 PM)
If he would at least try to understand IOTA's vision ... then maybe he wouldn't have to ask things like "Why don't you just copy a tech that only works with fees"
Hans Moog [IF] (yesterday at 4:35 PM)
u/Shaar
I thought this would only be used to 'override' finality, eg if there were network splits. But not in normal consensus
That is not correct. Every single transaction gets booked on arrival using the parallel reality based ledger state. If there are conflicts, then we create a "branch" (a container in the ledger state) that represents the perception that this particular double spend would be accepted by consensus. After consensus is reached, the container is simply marked as "accepted" and all transactions associated with this branch are immediately confirmed as well. This allows us to make the node use all of its computing resources 24/7 without having to wait for any kind of decision to be made, and allows us to scale the throughput to its physical limits. That's the whole idea of the "parallel reality based ledger state": instead of designing a data structure that models the ledger state "after consensus" like everybody else does, it is tailored to model the ledger state "before consensus", and then you just flip a flag to persist your decision. The "resync mechanism" also uses the branches to measure the amount of approval a certain perception of the ledger state receives. So if my own opinion is not in line with what the rest of the network has accepted (i.e. because I was eclipsed or because there was a network split), then I can use the weight of these branches to detect this "being out of sync" and can do another, larger query to re-evaluate my decision. (edited)
Also, what happens in IOTA if DRNG nodes fall out? Does the network continue if no new RNGs appear for a while? Or will new nodes be added to the DRNG committee sufficiently fast that no one notices?
It's a committee and not just a single DRNG provider. If a few nodes fail then it will still produce random numbers. And even if the whole committee fails, there are fallback RNGs that would be used instead
Hans Moog [IF] (yesterday at 4:58 PM)
And multiverse doesn't use FPC but only the weight of these branches, in the same way that blockchain uses the longest-chain-wins rule to choose between conflicts. So nodes simply attach their transactions to the transactions that they have seen first, and if there are conflicts then you simply monitor which version received more approval and adjust your opinion accordingly.
Hans Moog [IF] (yesterday at 5:07 PM)
We started integrating some of the non-controversial concepts (like the approval reset switch) into FPC and are currently refactoring goshimmer to support this
We are also planning to make the big mana holders publish their opinion in the tangle as a public statement, which allows us to measure the rate of approval in a similar way as multiverse would do it
So its starting to converge a bit but we are still using FPC as a metastability breaking mechanism
Once the changes are implemented it should be pretty easy to simulate and test both approaches in parallel
Serguei Popov [IF] (yesterday at 5:53 PM)
So the ask is that we ditch all our work and fork Avalanche because it has not been attacked in the month or so it has been up?
u/Navin Ramachandran [IF] yeah, that's hilarious. Avalanche consensus (at least their WP version) is clearly scientifically unsound.
Hans Moog [IF] (yesterday at 9:43 PM)
u/wtf maybe you should research avalanche before proposing such a stupid idea
and you will see that what I wrote is actually true
Hans Moog [IF] (yesterday at 9:44 PM)
paying fees is what "protects" them atm
and simply the fact that nobody uses the network for anything of value yet
we can't rely on fees making attack vectors "unattractive"
Serguei Popov [IF] (yesterday at 10:17 PM)
well (1.) very obviously the metastability problems are not a problem in practice,
putting "very obviously" before questionable statements very obviously shows that you are seeking a constructive dialogue 📷 (to make metastability work, the adversary needs to more-or-less know the current opinion vectors of most of the honest participants; I don't see why a sufficiently well-connected adversary cannot query enough honest nodes frequently enough to achieve that)
(2.) .... you'd need an unpredictable number every few tens/hundreds milliseconds, but your DRNG can only produce one every O(seconds).
the above assumption (about "every few tens/hundreds milliseconds") is wrong
We've had this discussion before, where you argued that the assumptions in the FPC-BI paper (incl. "all nodes must be known") are not to be taken 100% strictly, and that the results are to be seen more of an indication of overall performance.
Aham, I see. So, unfortunately, all that time that I invested into explaining that stuff during our last conversation was for nothing. Again, very briefly. The contents of the FPC-BI paper is not "an indication of overall performance". It rather shows (to someone who actually read and understood the paper) why the approach is sound and robust, as it makes one understand what is the mechanism that causes the consensus phenomenon occur.
Yet you don't allow for that same argument to be valid for the "metastability" problem in avalanche,
Incorrect. It's not "that same argument". FPC-BI is a decent academic paper that has precisely formulated results and proofs. The Ava WP (the probabilistic part of it), on the other hand, does not contain proofs of what they call results. More importantly, they don't even show a clear path to those proofs. That's why their system is scientifically unsound.
even when there's a live network that shows that it doesn't matter.
No, it doesn't show that it doesn't matter. It only shows that it works when not properly attacked. Their WP doesn't contain any insight on why those attacks would be difficult/impossible.
Hans Moog [IF] (yesterday at 10:56 PM)
That proposal was so stupid - Avalanche does several things completely differently, and we are putting quite a bit of effort into our solution to pretty much fix all of Avalanche's shortcomings
If we just wanted to have a working product and didn't care about security or performance, then we could have just forked a blockchain
I am pretty confident that once we are done - it's going to be extremely close to the best theoretical thresholds that DLTs will ever be able to achieve for an unsharded base layer
​-------------------------------------------------------------------------------------------------------------
Bas (yesterday at 2:43 AM)
Yesterday I was asked how a reasonably big company no one has heard of could best move forward implementing Access for thousands of locations worldwide. (Sorry for the vagueness, it's all confidential.) They read the article and want to implement it because it seems to fit a problem they're currently trying to solve. Such moves will vastly increase the utility of protocols like IOTA, and are what the speculation is built on. I do not think you can overestimate the impact Access is going to have. It's cutting out the middleman for simple things; no server or service needed. That's huge.
So yes, I think this space will continue to grow u/Coinnave

--------------------------------------------------------------------------------------------------------------
Angelo Capossele [IF] (2020.10.02)
In short: we are planning a new v0.3.0 release that should happen very soon. This version will bring fundamental changes to the structure of the entire codebase (but without additional features) so that progressing with the development will be easier and more consistent. We have also obtained outstanding results with the dRNG committee managed by the GoShimmer X-Team, so that will also be an integral part of v0.3.0. After that, we will merge the Value Tangle with the Message Tangle, so as to have only one Tangle and make the TSA and orphanage easier to manage. We are also progressing really well with Mana, which will be the focus after the merge. More or less this is what is going to happen this month.
We will release further details with the upcoming Research Status Update 📷

submitted by btlkhs to Iota

Technical: Taproot: Why Activate?

This is a follow-up on https://old.reddit.com/Bitcoin/comments/hqzp14/technical_the_path_to_taproot_activation/
Taproot! Everybody wants it!! But... you might ask yourself: sure, everybody else wants it, but why would I, sovereign Bitcoin HODLer, want it? Surely I can be better than everybody else because I swapped XXX fiat for Bitcoin unlike all those nocoiners?
And it is important for you to know the reasons why you, o sovereign Bitcoiner, would want Taproot activated. After all, your nodes (or the nodes your wallets use, which, if you are using SPV, you can hopefully pester your wallet vendor/implementor about) need to be upgraded in order for Taproot activation to actually succeed instead of becoming a hot sticky mess.
First, let's consider some principles of Bitcoin.
I'm sure most of us here would agree that the above are very important principles of Bitcoin and that these are principles we would not be willing to remove. If anything, we would want those principles strengthened (especially the last one, financial privacy, which current Bitcoin is only sporadically strong with: you can get privacy, it just requires effort to do so).
So, how does Taproot affect those principles?

Taproot and Your Coins

Most HODLers probably HODL their coins in singlesig addresses. Sadly, switching to Taproot would do very little for you (it gives a mild discount at spend time, at the cost of a mild increase in fee at receive time (paid by whoever sends to you, so if it's a self-send from a P2PKH or bech32 address, you pay for this); mostly a wash).
(technical details: a Taproot output is 1 version byte + 32 byte public key, while a P2WPKH (bech32 singlesig) output is 1 version byte + 20 byte public key hash, so the Taproot output spends 12 bytes more; spending from a P2WPKH requires revealing a 32-byte public key later, which is not needed with Taproot, and Taproot signatures are about 9 bytes smaller than P2WPKH signatures, but the 32 bytes plus 9 bytes is divided by 4 because of the witness discount, so it saves about 10 virtual bytes; mostly a wash, it increases blockweight by just under 2 virtual bytes (about 7 weight) for each Taproot-output-input, compared to P2WPKH-output-input).
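As a rough check on that arithmetic (using the approximate byte counts from the parenthetical above, not exact serialization sizes):

```python
output_growth = 32 - 20                  # Taproot key vs. P2WPKH key-hash in the output
witness_savings = 32 + 9                 # pubkey no longer revealed, plus a smaller signature
net_vbytes = output_growth - witness_savings / 4   # witness bytes get the 4x discount
print(net_vbytes)                        # ~1.75 vbytes (~7 weight) extra per output + spend
```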
However, as your HODLings grow in value, you might start wondering if multisignature k-of-n setups might be better for the security of your savings. And it is in multisignature that Taproot starts to give benefits!
Taproot switches to using Schnorr signing scheme. Schnorr makes key aggregation -- constructing a single public key from multiple public keys -- almost as trivial as adding numbers together. "Almost" because it involves some fairly advanced math instead of simple boring number adding, but hey when was the last time you added up your grocery list prices by hand huh?
With current P2SH and P2WSH multisignature schemes, if you have a 2-of-3 setup, then to spend, you need to provide two different signatures from two different public keys. With Taproot, you can create, using special moon math, a single public key that represents your 2-of-3 setup. Then you just put two of your devices together, have them communicate with each other (this can be done airgapped, in theory, by sending QR codes: the software to do this is not even being built yet, but that's because Taproot hasn't activated yet!), and they will make a single signature to authorize any spend from your 2-of-3 address. That's 73 witness bytes -- 18.25 virtual bytes -- of signatures you save!
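To get a feel for why aggregation is so easy with Schnorr, here is a toy sketch in Python. It deliberately uses the original Schnorr setting (a tiny multiplicative group, p = 23, subgroup order q = 11) instead of Bitcoin's secp256k1 curve, and it is a naive 2-of-2 with none of the protections a real protocol like MuSig2 adds (key tweaking, nonce commitments), so treat it as intuition only:

```python
import hashlib

p, q, g = 23, 11, 2             # tiny Schnorr group: g generates a subgroup of order q mod p

def H(*parts):                  # challenge hash, reduced into the scalar field
    data = "|".join(str(x) for x in parts).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

# Two signers with independent private keys.
x1, x2 = 3, 7                   # private keys (scalars mod q)
X1, X2 = pow(g, x1, p), pow(g, x2, p)
X = (X1 * X2) % p               # aggregated public key: just combine the keys

# Each signer picks a nonce; the combined nonce is combined the same way.
r1, r2 = 4, 9
R = (pow(g, r1, p) * pow(g, r2, p)) % p

msg = "pay the gewgaw seller"
e = H(R, X, msg)                # shared challenge over the aggregate nonce and key

# Each signer contributes a partial signature; the parts simply add up.
s1 = (r1 + e * x1) % q
s2 = (r2 + e * x2) % q
s = (s1 + s2) % q               # the single signature that hits the chain

# Verification against the single aggregated key.
assert pow(g, s, p) == (R * pow(X, e, p)) % p
print("aggregate signature verifies:", (R, s))
```

The point is the last check: one ordinary-looking signature, verified against one public key, even though two signers were involved.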
And if you decide that your current setup with 1-of-1 P2PKH / P2WPKH addresses is just fine as-is: well, that's the whole point of a softfork: backwards-compatibility; you can receive from Taproot users just fine, and once your wallet is updated for Taproot-sending support, you can send to Taproot users just fine as well!
(P2WPKH and P2WSH -- SegWit v0 -- addresses start with bc1q; Taproot -- SegWit v1 --- addresses start with bc1p, in case you wanted to know the difference; in bech32 q is 0, p is 1)
Now how about HODLers who keep all, or some, of their coins on custodial services? Well, any custodial service worth its salt would be doing at least 2-of-3, or probably something even bigger, like 11-of-15. So your custodial service, if it switched to using Taproot internally, could save a lot more (imagine an 11-of-15 getting reduced from 11 signatures to just 1!), which --- we can only hope! --- should translate to lower fees and better customer service from your custodial service!
So I think we can say, very accurately, that the Bitcoin principle --- that YOU are in control of your money --- can only be helped by Taproot (if you are doing multisignature), and, because P2PKH and P2WPKH remain validly-usable addresses in a Taproot future, will not be harmed by Taproot. Its benefit to this principle might be small (it mostly only benefits multisignature users) but since it has no drawbacks with this (i.e. singlesig users can continue to use P2WPKH and P2PKH still) this is still a nice, tidy win!
(even singlesig users get a minor benefit, in that multisig users will now reduce their blockchain space footprint, so that fees can be kept low for everybody; so for example even if you have your single set of private keys engraved on titanium plates sealed in an airtight box stored in a safe buried in a desert protected by angry nomads riding giant sandworms because you're the frickin' Kwisatz Haderach, you still gain some benefit from Taproot)
And here's the important part: if P2PKH/P2WPKH is working perfectly fine with you and you decide to never use Taproot yourself, Taproot will not affect you detrimentally. First do no harm!

Taproot and Your Contracts

No one is an island, no one lives alone. Give and you shall receive. You know: by trading with other people, you can gain expertise in some obscure little necessity of the world (and greatly increase your productivity in that little field), and then trade the products of your expertise for necessities other people have created, all of you thereby gaining gains from trade.
So, contracts, which are basically enforceable agreements that facilitate trading with people who you do not personally know and therefore might not trust.
Let's start with a simple example. You want to buy some gewgaws from somebody. But you don't know them personally. The seller wants the money, you want their gewgaws, but because of the lack of trust (you don't know them!! what if they're scammers??) neither of you can benefit from gains from trade.
However, suppose both of you know of some entity that both of you trust. That entity can act as a trusted escrow. The entity provides you security: this enables the trade, allowing both of you to get gains from trade.
In Bitcoin-land, this can be implemented as a 2-of-3 multisignature. The three signatories in the multisgnature would be you, the gewgaw seller, and the escrow. You put the payment for the gewgaws into this 2-of-3 multisignature address.
Now, suppose it turns out neither of you are scammers (whaaaat!). You receive the gewgaws just fine and you're willing to pay up for them. Then you and the gewgaw seller just sign a transaction --- you and the gewgaw seller are 2, sufficient to trigger the 2-of-3 --- that spends from the 2-of-3 address to a singlesig the gewgaw seller wants (or whatever address the gewgaw seller wants).
But suppose some problem arises. The seller gave you gawgews instead of gewgaws. Or you decided to keep the gewgaws but not sign the transaction to release the funds to the seller. In either case, the escrow is notified, and it can then sign with you to refund the funds back to you (if the seller was a scammer) or sign with the seller to forward the funds to the seller (if you were a scammer).
Taproot helps with this: as mentioned above, it allows multisignature setups to produce only one signature, reducing blockchain space usage, and thus making contracts --- which by definition require multiple people; you don't make contracts with yourself --- cheaper (which we hope enables more of these setups to happen, for more gains from trade for everyone; also, moon and lambos).
(technology-wise, it's easier to make an n-of-n than a k-of-n, making a k-of-n would require a complex setup involving a long ritual with many communication rounds between the n participants, but an n-of-n can be done trivially with some moon math. You can, however, make what is effectively a 2-of-3 by using a three-branch SCRIPT: either 2-of-2 of you and seller, OR 2-of-2 of you and escrow, OR 2-of-2 of escrow and seller. Fortunately, Taproot adds a facility to embed a SCRIPT inside a public key, so you can have a 2-of-2 Taprooted address (between you and seller) with a SCRIPT branch that can instead be spent with 2-of-2 (you + escrow) OR 2-of-2 (seller + escrow), which implements the three-branched SCRIPT above. If neither of you are scammers (hopefully the common case) then you both sign using your keys and never have to contact the escrow, since you are just using the escrow public key without coordinating with them (because n-of-n is trivial but k-of-n requires setup with communication rounds), so in the "best case" where both of you are honest traders, you also get a privacy boost, in that the escrow never learns you have been trading on gewgaws, I mean ewww, gawgews are much better than gewgaws and therefore I now judge you for being a gewgaw enthusiast, you filthy gewgawer).
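Purely for illustration, here is a rough Python sketch of that three-branch escrow policy. The names are hypothetical and this is not descriptor or miniscript syntax (which is what real wallets would use); it only shows which signer sets can spend:

```python
# One aggregated key path for the happy case, plus two hidden script branches
# that each require the escrow's help.
ESCROW_POLICY = {
    "key_path": {"buyer", "seller"},                              # 2-of-2 aggregate
    "script_paths": [{"buyer", "escrow"}, {"seller", "escrow"}],  # dispute branches
}

def can_spend(signers: set) -> bool:
    paths = [ESCROW_POLICY["key_path"], *ESCROW_POLICY["script_paths"]]
    return any(path <= signers for path in paths)

assert can_spend({"buyer", "seller"})       # happy case, escrow never contacted
assert can_spend({"seller", "escrow"})      # buyer misbehaved or vanished
assert not can_spend({"buyer"})             # no unilateral spend
```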

Taproot and Your Contracts, Part 2: Cryptographic Boogaloo

Now suppose you want to buy some data instead of things. For example, maybe you have some closed-source software in trial mode installed, and want to pay the developer for the full version. You want to pay for an activation code.
This can be done, today, by using an HTLC. The developer tells you the hash of the activation code. You pay to an HTLC, paying out to the developer if it reveals the preimage (the activation code), or refunding the money back to you after a pre-agreed timeout. If the developer claims the funds, it has to reveal the preimage, which is the activation code, and you can now activate your software. If the developer does not claim the funds by the timeout, you get refunded.
And you can do that, with HTLCs, today.
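Here is a toy model in plain Python (not Bitcoin Script) of the two HTLC spend paths just described, with made-up values for the activation code and timeout:

```python
import hashlib

def sha256(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

activation_code = b"SUPER-SECRET-KEY-1234"        # known only to the developer
payment_hash = sha256(activation_code)            # what the buyer locks funds to
timeout_height = 700_000                          # pre-agreed refund height

def try_spend(path: str, *, preimage: bytes = b"", height: int = 0) -> str:
    if path == "claim" and sha256(preimage) == payment_hash:
        return "developer paid; buyer now knows the activation code"
    if path == "refund" and height >= timeout_height:
        return "buyer refunded"
    return "invalid spend"

print(try_spend("claim", preimage=activation_code))
print(try_spend("refund", height=700_001))
```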
Of course, HTLCs do have problems:
Fortunately, with Schnorr (which is enabled by Taproot), we can now use the Scriptless Script construction by Andrew Poelstra. This Scriptless Script allows a new construction, the PTLC or Pointlocked Timelocked Contract. Instead of hashes and preimages, just replace "hash" with "point" and "preimage" with "scalar".
Or as you might know them: "point" is really "public key" and "scalar" is really a "private key". What a PTLC does is that, given a particular public key, the pointlocked branch can be spent only if the spender reveals the private key of the given public key to you.
Another nice thing with PTLCs is that they are deniable. What appears onchain is just a single 2-of-2 signature between you and the developer/manufacturer. It's like a magic trick. This signature has no special watermarks, it's a perfectly normal signature (the pledge). However, from this signature, plus some data given to you by the developer/manufacturer (known as the adaptor signature), you can derive the private key of a particular public key you both agree on (the turn). Anyone scraping the blockchain will just see signatures that look just like every other signature, and as long as nobody manages to hack you and get a copy of the adaptor signature or the private key, they cannot get the private key behind the public key (point) that the pointlocked branch needs (the prestige).
(Just to be clear, the public key you are getting the private key from is distinct from the public key that the developer/manufacturer will use for its funds. The activation key is different from the developer's onchain Bitcoin key, and it is the activation key whose private key you will be learning, not the developer's/manufacturer's onchain Bitcoin key.)
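Here is a toy sketch of that adaptor-signature trick, reusing the same tiny (insecure) Schnorr group as the earlier example and a single signer standing in for the 2-of-2, purely to show the pledge/turn/prestige mechanics:

```python
import hashlib

p, q, g = 23, 11, 2

def H(*parts):
    data = "|".join(str(x) for x in parts).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

x, X = 5, pow(g, 5, p)          # the payer's signing key (stand-in for the 2-of-2)
t, T = 6, pow(g, 6, p)          # activation code: secret scalar t, public point T

# --- seller prepares the adaptor ("the pledge") ---------------------------
r = 3
R = pow(g, r, p)
e = H((R * T) % p, X, "pay for activation code")
s_pre = (r + e * x) % q          # adaptor signature, handed to the buyer off-chain
assert pow(g, s_pre, p) == (R * pow(X, e, p)) % p   # buyer checks it is well-formed

# --- completing the payment ("the turn") reveals the secret ---------------
s = (s_pre + t) % q              # the ordinary-looking signature that gets broadcast
assert pow(g, s, p) == ((R * T) % p * pow(X, e, p)) % p   # valid Schnorr signature

# --- buyer extracts the activation code ("the prestige") ------------------
recovered = (s - s_pre) % q
assert recovered == t
```

Completing the payment and revealing the scalar are the same step: the broadcast signature differs from the adaptor signature by exactly the activation secret.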
So:
Taproot lets PTLCs exist onchain because they enable Schnorr, which is a requirement of PTLCs / Scriptless Script.
(technology-wise, take note that Scriptless Script works only for the "pointlocked" branch of the contract; you need normal Script, or a pre-signed nLockTimed transaction, for the "timelocked" branch. Since Taproot can embed a script, you can have the Taproot pubkey be a 2-of-2 to implement the Scriptless Script "pointlocked" branch, then have a hidden script that lets you recover the funds with an OP_CHECKLOCKTIMEVERIFY after the timeout if the seller does not claim the funds.)

Quantum Quibbles!

Now if you were really paying attention, you might have noticed this parenthetical:
(technical details: a Taproot output is 1 version byte + 32 byte public key, while a P2WPKH (bech32 singlesig) output is 1 version byte + 20 byte public key hash...)
So wait, Taproot uses raw 32-byte public keys, and not public key hashes? Isn't that more quantum-vulnerable??
Well, in theory yes. In practice, they probably are not.
It's not that hashes can be broken by quantum computers --- they still can't be. Instead, you have to look at how you spend from a P2WPKH/P2PKH pay-to-public-key-hash.
When you spend from a P2PKH / P2WPKH, you have to reveal the public key. Then Bitcoin hashes it and checks if this matches with the public-key-hash, and only then actually validates the signature for that public key.
So an unconfirmed transaction, floating in the mempools of nodes globally, will show, in plain sight for everyone to see, your public key.
(public keys should be public, that's why they're called public keys, LOL)
And if quantum computers are fast enough to be of concern, then they are probably fast enough that, in the several minutes to several hours from broadcast to confirmation, they have already cracked the public key that is openly broadcast with your transaction. The owner of the quantum computer can now replace your unconfirmed transaction with one that pays the funds to itself. Even if you did not opt in to RBF, miners are still incentivized to support RBF on RBF-disabled transactions.
So the extra hash is not as significant a protection against quantum computers as you might think. Instead, the extra hash-and-compare needed is just extra validation effort.
Further, if you have ever, in the past, spent from the address, then there exists already a transaction indelibly stored on the blockchain, openly displaying the public key from which quantum computers can derive the private key. So those are still vulnerable to quantum computers.
For the most part, the cryptographers behind Taproot (and Bitcoin Core) are of the opinion that quantum computers capable of cracking Bitcoin pubkeys are unlikely to appear within a decade or two.
So:
For now, the homomorphic and linear properties of elliptic curve cryptography provide a lot of benefits --- particularly the linearity property is what enables Scriptless Script and simple multisignature (i.e. multisignatures that are just 1 signature onchain). So it might be a good idea to take advantage of them now while we are still fairly safe against quantum computers. It seems likely that quantum-safe signature schemes are nonlinear (thus losing these advantages).

Summary

I Wanna Be The Taprooter!

So, do you want to help activate Taproot? Here's what you, mister sovereign Bitcoin HODLer, can do!

But I Hate Taproot!!

That's fine!

Discussions About Taproot Activation

submitted by almkglor to Bitcoin

Unpopular opinion - the economy has to become dynamic in order for it to have any longevity (and other musings on the progression)

Ain't no one gonna read this but here it goes!
The issue of progression has recently been gaining some traction in the community with Klean and DeadlySlob covering this topic recently.
Now any solution to this has an inherent issue associated with it - it'll be uncomfortable to someone. Whatever is done, it'll negatively affect someone, just by the fact of change alone. You cannot make something better by not changing anything. So anything you do or don't do, you will alienate a portion of your playerbase.
Early/Mid-game vs Late game.
Early and mid game is lauded, late game is considered boring. But why? For starters, firefights last longer, require more skill, movement, tactics and outsmarting your opponent. You value your life, you feel respect even for the shittiest of bullets. You have a feeling that the kill is earned. Guns have tons of recoil so you need to pick your shots. It's... I know it's illegal... but it's fun.
Late game however is plagued with a number of issues. Gear gets dominated by very similar loadouts that cover approx 10% of the gear in the game. There's nowhere to progress as you've reached the ceiling. The excitement from killing a kitted player diminishes over time as the economy saturates. People start being picky with their loot and only the good stuff brings any sort of satisfaction. The hideout provides a steady, predictable stream of income.
You let it run long enough it becomes a mindless PVP battleground.
Side note - the black and white fallacy of the makeup of the community.
Casuals vs hardcores. Rats vs Chads. Whenever a discussion pops up, this dichotomy is always present. "Feature X hurts casuals but doesn't bother hardcore gamers playing 8h a day". No. Like anything in life, the population of EFT is subject to the bell curve distribution. There are hardcore sweaties grinding out the kappa within a week and there are also Sunday gamers. Then there's everything else in between. Let's keep that in mind.
You don't need to be a streamer or play the game as a full time job to make money. We have a discord for 30+ yr old gamers with families and all of us were swimming in roubles and gear after 3 months of the past wipe. Sure it takes us longer than streamers, but still.
The meta
Taking weapons as an example. Different items have different stats (recoil, ergonomics, etc), some are obviously better than others which obviously makes them more sought after. There are also different ammo types for every caliber. Then lastly we come to the guns which directly tie into the first point, by their base stats and how much those can be brought down/up by attachments.
If you have a plethora of items that have different stats, there's sure to be an optimal loadout. If that optimal loadout is always available at an attainable price to the point where you can run it consistently, then there's really no reason to run anything less. This is the meta and at the moment it's basically a synonym for best in slot.
Appealing to a greater good such as gameplay variety is in vain because people will do everything to put themselves in the best possible position. If that means running whatever flavor of meta weapon that is - VAL, M4, FAL alongside top tier lvl 5 or 6 armor over and over and over and over again, so be it. We all know that's not the only way to get by in EFT, but all else being equal - top gear puts you on equal footing at minimum.
Trash contextualizes treasure. A rare item is not rare if everyone is running it. It's a normal item.
Gear minmaxing combined with a ceiling in progression creates a situation where the game becomes stale, people get bored and we get chants for a wipe to relieve the pressure.
Wipes
Wipes however, even at set intervals, are not the solution. Every wipe, in the absence of something fundamentally new, gives you (rapidly) diminishing returns. Doing the same quests over and over is an absolute drag. It's my 7th wipe and this time around I've really hit a brick wall with them. Now imagine doing them every 3 months. Maybe just do an inventory and trader level wipe? Yeah, that's just skipping one part of it and arriving at the same point but even quicker, considering how quickly you can make money.
The endpoint being - having enough money to run anything you want all the time without the fear of getting broke. Or in the abstract, having a big enough cushion to make any blow from a bad streak become inconsequential.
All of that is just a perpetuation of the same sawtooth progression. Grind, saturate, wipe, grind, saturate, wipe.
Side note - persistent character vs wiped character
I know there have been talks about having two characters - one persistent that's not wiped and one seasonal that is. On paper this might look like a good solution, but there are some problems.
POE players would have to chip in, but I reckon that, in a way, this might become a form of matchmaking: the persistent character would be a mode for "Sunday" players, while the wiped one would be for the sweats. I mean, maybe that's the way to go, but if the game is to have any longevity, the persistent character will eventually face the same issues as the current game; it'll just take longer to develop.
Unpopular opinion - The economy is just a set of time and effort gated unlocks.
There have been multiple ideas to prolong a wipe, but in my view the fundamental issue with those is that they're based on the same linear progression: start from scratch and accumulate wealth until saturation. Some of these ideas include restricting Labs till level X, locking it behind a quest, or just disabling it for a month. The problem with these is that they're just delaying the inevitable, while also giving a direct buff to those who get there first, as they'll have the place virtually to themselves.
What follows is the concept of "starting mid wipe", which essentially means that the gear disparity is so big that the further into a wipe you are, the more difficult it is to catch up. That effort is directly correlated with experience: the more experience you have, the easier it is to reset or jump in mid-wipe. Extending a wipe potentially alleviates that by giving people more opportunity to catch up, but it also pushes people away from coming back (or getting into the game at all) if they recognize that it has passed their personal breakpoint where it's too hard or frustrating.
Perpetual mid-game
So out of all of that, a clearer picture emerges. We have to somehow find a solution to always have something to work for, but also not give the impression that you're up against an impenetrable wall.
That means that the game needs to pivot around something colloquially known as mid game. How would we define mid-game? That's another debate, but for the sake of the argument we could define that as something in the range of:
That would be the sort of mean loadout you can run on a consistent basis and that you'd see the majority of the time. From the sentiment across the community, this seems to be the most enjoyable state of the game, where the sweet spot is in terms of protection and vulnerability, while still allowing a lot of headroom for both variety and progression.
Solutions
Now we have to remember that there are a number of changes inbound that will alleviate some of these issues, but those are still far on the horizon.
The uncomfortable reality is that in order to truly balance this you have only a few choices. One is to go down the route of typical FPS tropes where every weapon type is perfectly balanced (i.e. shotguns powerful but limited range, SMGs low recoil and high ROF but weaker, DMRs powerful but high recoil and low ROF, etc). I don't think this will ever be a thing in the game.
Another one is to make attachments roughly equal and just attribute the differences to the tacticool visual factor. This would be realistic in a way, but would take away from the game.
The last one is to price them out. Literally. I'm of the unpopular opinion that endgame should not be a stage, it should be a state.
Dynamic pricing
I know, I know, last time it failed spectacularly. However, that was a different flea market and the implementation was poorly thought out. Since it didn't have a pivot point to relate to, it caused widespread inflation of even the most basic items and was prone to manipulation.
However, the concept has in principle proven itself to work: M995 was essentially priced out of existence and forced people to look for alternatives like M855A1 or M856A1, or different calibers altogether. Even the sweatiest of sweats got a bit excited when they killed someone with three 60-rounders filled with M995. See where I'm going with this?
It was the execution that was poor, not the concept itself.
But how about a different implementation? Adjust the prices based on how much an item is (or is not) bought compared to other items of the same item type. Most popular items' price (of a specific category) increases, while the least popular one decreases.
This could also be coupled with (or as an alternative) an additional rarity factor which would sort of specify how volatile the price is. Continuing the ammo example M995 would have the highest rarity factor and would be very prone to price increases, while the likes of M855 would be considered common and have a much more stable price.
Obviously this would be subject to long term trends and would not happen overnight. But the main aim is to dynamically scale the economy to the general wealth of the playerbase around a certain pivot point which we established before as the mid-game.
This would be quite a significant blow to the uberchads, as they would unironically struggle to maintain a profit from their runs. And yes, some of them would still probably be able to pull it off, but remember what we said about the bell curve? It's about making them so insignificant in the global player pool that they'd be a very rare occurrence.
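Just to make the feedback loop concrete, here's a rough sketch in C of how such a rule could work. Every item, number and the volatility factor is a made-up assumption of mine, not anything BSG has said; the only point is that over-bought items drift up in price while ignored ones drift down.

```c
#include <stdio.h>

/* Hypothetical sketch of the proposed dynamic pricing rule: each pricing
 * cycle, an item's price drifts up or down depending on how its purchase
 * share compares to the average share in its category, scaled by a
 * per-item volatility ("rarity") factor. All values are illustrative. */
typedef struct {
    const char *name;
    double price;       /* current trader price in roubles */
    double volatility;  /* 0.0 = stable price, 1.0 = rare and very volatile */
    long bought;        /* units bought this cycle */
} Item;

static void adjust_prices(Item *items, int count, double drift_scale)
{
    long total = 0;
    for (int i = 0; i < count; i++)
        total += items[i].bought;
    if (total == 0)
        return;

    double avg_share = 1.0 / count;
    for (int i = 0; i < count; i++) {
        double share = (double)items[i].bought / (double)total;
        /* Positive when the item is over-bought, negative when ignored. */
        double pressure = (share - avg_share) / avg_share;
        items[i].price *= 1.0 + pressure * items[i].volatility * drift_scale;
    }
}

int main(void)
{
    Item ammo[] = {
        { "M995",   900.0, 0.9, 8000 },
        { "M855A1", 600.0, 0.5, 5000 },
        { "M855",   120.0, 0.1, 3000 },
    };
    adjust_prices(ammo, 3, 0.05); /* 5% per-cycle drift scale */
    for (int i = 0; i < 3; i++)
        printf("%-7s %.0f\n", ammo[i].name, ammo[i].price);
    return 0;
}
```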
Global item pools
This idea has been floated around by Nikita some time ago, but we have no ETA on it. In short: for some items, there is only a set amount present in circulation. For example, there are only X ReapIRs in the entire economy, counting spawns, traders and player stashes. If everyone hoards them in their stashes, that's where they'll remain; they don't spawn on maps and they're not sold by traders. Only once they're lost do they get reinjected into the item pool.
This idea should be reserved only for the absolute top tier OP items. Something that you'd get all giddy if found/looted and you'd contemplate taking it out.
Side note, the X amount should scale to the active playerbase, which could be something like a weekly or biweekly moving average of people actively playing the game in a set period.
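A tiny sketch of what that scaling could look like; the two-week window and the players-per-item ratio are pure assumptions on my part:

```c
#include <stddef.h>

/* Hypothetical sketch: the global cap for a rare item follows a simple
 * moving average of daily active players over a two-week window. The
 * "one item per N active players" ratio is an illustrative assumption. */
#define WINDOW_DAYS 14

static double moving_average(const long *daily_active, size_t days)
{
    size_t n = days < WINDOW_DAYS ? days : WINDOW_DAYS;
    long sum = 0;
    for (size_t i = days - n; i < days; i++)
        sum += daily_active[i];
    return n ? (double)sum / (double)n : 0.0;
}

static long item_cap(const long *daily_active, size_t days, long players_per_item)
{
    long cap = (long)(moving_average(daily_active, days) / players_per_item);
    return cap > 0 ? cap : 1; /* always keep at least one in circulation */
}
```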
Insurance
This one is a bit controversial, but it also contributes to some of the in-game inflation and gear recirculation. If you run a large squad, even if one of you dies, there's a high chance someone will survive and secure the others' gear. And even if all of you die, something's bound to come back.
That's why I think the chance of getting your gear back from insurance should be debuffed the bigger your squad is, for example an incremental 10% reduction for each additional squadmate.
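In code, the idea could be as simple as the sketch below; whether the 10% is applied multiplicatively (as here) or as a flat reduction, and the base chance itself, are just my assumptions:

```c
/* Sketch of the proposed squad-size debuff on insurance returns: each extra
 * squadmate multiplies the return chance by 0.9. Base chance is made up. */
static double insurance_return_chance(double base_chance, int squad_size)
{
    double chance = base_chance;
    for (int i = 1; i < squad_size; i++)
        chance *= 0.9;
    return chance;
}
/* e.g. base 0.6: solo 0.60, duo 0.54, 5-man squad ~0.39 */
```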
Hideout adjustments
Right now fuel consumption is static no matter how much stuff is going on. What if the fuel consumption rate was tied to the size of your bitcoin farm and the amount of crafting going on?
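Something along these lines; all the coefficients are made up, the point is only that an idle hideout sips fuel while a busy one burns it:

```c
/* Sketch of the idea above: fuel drain scales with how much the hideout is
 * actually doing. All coefficients are illustrative assumptions. */
static double fuel_per_hour(int gpu_count, int active_crafts)
{
    const double base      = 1.0;   /* idle hideout */
    const double per_gpu   = 0.15;  /* each GPU in the bitcoin farm */
    const double per_craft = 0.25;  /* each running craft */
    return base + per_gpu * gpu_count + per_craft * active_crafts;
}
```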
Additionally hideout appliances could wear out and require maintenance, which would grant them performance debuffs like increased crafting time.
Dynamic stocks.
Right now stocks are predictable: you get the same amount of items at a set interval. What if traders could be missing some items, or not get a restock at all due to broken supply lines? This could be cheekily tied into...
Dynamic global events/quests
Such as getting rid of scavs on a particular location to remove a roadblock. These might be done per player or as a global event where everyone has to chip in.
Summary
The subject is difficult and the solutions are not simple, but what I do know is that eventually Tarkov will have to settle into an identity, and that will come with a sacrifice either at the expense of the vision or of mainstream popularity.
Thank you for coming to my TEDTalk. I'd like to give a heartfelt thank you to the 5 people that read this wall of text.
submitted by sunseeker11 to EscapefromTarkov [link] [comments]

Radix Solving DeFi Risk

RADIX: THE PROTECTION AGAINST DEFI RISK
Radix is a layer-one protocol for DeFi. Currently, DeFi applications are built on protocols that are not scalable. Radix has created a robust, secure, and scalable protocol for building applications and tokens. Building on the success of existing public ledgers, the Radix protocol is a permissionless framework within which DeFi services can be developed and operated. Radix claims to solve two of the biggest problems in DeFi: scalability and security. Overall, the blockchain-based decentralized finance (DeFi) space is still evolving, but it offers a compelling value proposition in which individuals and institutions have broader access to financial applications without the need for a trusted broker.
WHAT IS DECENTRALIZED FINANCE (DeFi)
Decentralized finance is a new financial system based on public blockchains such as Bitcoin and Ethereum. After all, Bitcoin and Ethereum are not just digital currencies. They are essentially open-source networks that can be used to change the way the world economy works. DeFi is a significant project to decentralize traditional core use cases such as trading, lending, investment, asset management, payments, and insurance on blockchains. DeFi relies on decentralized applications, or protocols (dApps). By running these dApps on a blockchain, a peer-to-peer financial network is provided. dApps can be combined with each other like Lego blocks. Smart contracts act as connectors, comparable to well-defined APIs in traditional systems. Rarely will you get great rewards without huge risk. Just like every other industry, the DeFi system has its own risks and issues. Unfortunately, many DeFi users underestimate the risk associated with automated loan protocols' impressive interest rates.
FORMS OF DEFI RISKS
When working with DeFi solutions, it is essential to consider technical and procedural risks as well. Technical risk means assessing potential weak spots in the hardware and software behind a product or service, which is especially important for decentralized applications (dApps). Procedural risk can be viewed as similar to technical risk, but rather than considering the product or service itself, procedural risk examines how users can be directed to use the product in undesirable ways that could compromise their safety.
RADIX SOLVING DeFi RISKS
DeFi is worth more than $8 billion. However, to reach its full potential, DeFi requires fast transactions with minimal fees and secure systems to build on. DeFi applications must be scalable and composable. Protocols such as Ethereum 2.0, Polkadot, and Cosmos solve the wrong scaling problems and fail to attract others, according to Piers. According to Piers, for mainstream DeFi to work for both users and developers, it needs a DLT platform built from the ground up for DeFi. This is the purpose of Radix.
Incentives are needed to attract developers for the DeFi ecosystem to continue to grow. Radix has an innovative incentive program for developers that allows them to take advantage of the applications they contribute to. Radix has two significant innovations: The first is Cerberus, the scalable consensus protocol. Thanks to its highly fragmented data structure and its unique application layer, Cerberus can process many transactions.
The second innovation is the Radix Engine, a developer interface that enables applications to be quickly deployed to the public ledger in a secure environment. The Radix Engine is the Radix application layer. In Crypto Chat, Piers anticipated that over the next decade more liquidity will move through DeFi than through any exchange. “The key component of DeFi is how liquidity can move between applications and products.” The Radix protocol is a combination of four core technologies that solve four significant obstacles to the growth of DeFi. It is a platform where transactions are fast, with minimal transaction fees and high security. Scaling is unlimited, and connections between applications are open; dApps can be created quickly and can rely on the platform to safely manage user resources. Builders are rewarded directly by the platform for their contributions, both large and small. It is a platform intended to serve as the basis for mainstream DeFi on a global scale.
Each of the four technologies on the Radix platform represents a breakthrough on a DeFi-related issue that we want to share with the world. At a critical technology milestone last year, the Radix team addressed DeFi's core scalability problem by using its technology to process over 1 million transactions per second, a throughput more than five times that of NASDAQ at its peak.
THE POSSIBLE IMPACT OF DECENTRALIZED FINANCE
Five ways decentralized finance can affect the world:
1. Accessing financial services across borders. With decentralized finance, all you need is an internet connection to access financial services from any part of the world. There are several barriers to access in the current system: status (citizenship, documents, identity, etc.), lack of wealth (high entry-level funds required to access financial services), and location (great distance from functioning economies and financial service providers). In a decentralized financial system, a senior trader at a financial firm has the same access as a farmer in a remote area of India.
2. Affordable cross-border payments. Decentralized finance eliminates costly intermediaries to make remittance services more affordable for the world's population. In today's system, sending money across borders is too expensive: the average global transfer fee is 7%. With decentralized financial services, transfer fees can be less than 3%.
3. More privacy and security. With decentralized finance, users keep responsibility for their assets and can transact securely without a central party's approval. In the current system, custodians put people's wealth and information at risk if they fail to protect them.
4. Censorship-resistant transactions. In a decentralized financial network, transactions are immutable and the blockchain cannot be shut down by central institutions such as governments, central banks, or large corporations. In places with poor governance and authoritarianism, users can exit to the decentralized financial system to protect their assets. Venezuelans, for example, are already using Bitcoin to protect their wealth from government manipulation and hyperinflation.
5. Ease of use. With plug-and-play applications, users can access and use decentralized finance without going through centralized finance. With a decentralized system, anyone, from any part of the world, can get a loan through interoperable apps, invest in a business, pay off the loan, and make a profit.
Written by Naphtali Dabuk. For more information visit https://t.me/radix_dlt https://twitter.com/radixdlt http://www.radixdlt.com/
submitted by d_realnafty to Radix [link] [comments]

I'm kinda ok with MCO -> CRO Swap; a indepth personal view

EDIT: this post https://www.reddit.com/Crypto_com/comments/i2yhuz/open_letter_to_kris_from_one_of_cdcs_biggest/ from u/CryptoMines expresses my sentiments and concerns better than I could ever put into words myself. I'd say read his/her post instead.
Very long post ahead, but TL;DR: I actually see this swap as a positive change, despite fearing what it may do to my portfolio and having mixed feelings about its consequences for CDC's reputation. Before I start, for the sake of context and bias, here's my personal situation as a CDC user:
  1. I'm just an average Joe, with a 500 MCO Jade card. I bought 50 MCO at 5,22€ in September 2019 and staked for Ruby, then bought 440 MCO at 2.47€ in March 2020 and upgraded to Jade. The total amount of MCO I own is currently 515, and everything above the 500 stake is cashback rewards.
  2. I bought MCO exclusively for the card and the bonus Earn interest benefits, and had no plans to unstake my MCO. Now, with the swap, I definitely won't unstake.
  3. The MCO -> CRO conversion rates increased the fiat value of my MCO in about 1000€.
  4. I own a decent amount of CRO, which I bought at ~0,031€ in March 2020.
  5. The country where I live is crypto friendly and completely crypto-tax free; I only have to pay income tax if I deposit a certain threshold of fiat in my bank.
Take all these factors into account as possible (if not major) influences or biases on my opinions, both the emotional and the economic ones. Call me a fool or a devil's advocate if you want, but keep your torches and pitchforks down. As we say here on Reddit: "Remember the human".
-----------------------------------------------------------------------------------------------------------------------------------------------------
Like all of you, I woke up to find this announcement, which came right the #$%& out of nowhere and gives you little to no options. Good or bad, this announcement arrived as basically a "comply or die" choice. Emotionally, it came across as both terrifying and disgusting; but rationally, I cannot blame CDC for it.
Because whether we like it or not, CDC is a centralized company, and the MCO tokens were never a stock or a legally binding contract; something which pretty much every crypto company or ICO warns about in their T&Cs and risk warnings. Not to mention the mostly unregulated status of the cryptocurrency space. I'll call this "dishonest" any day, but I cannot see it as "scammy", since I can't see how they broke any rules or terms.
A scammer would take your money/assets away, but CDC is offering to swap yours for another asset which you can sell right away if you want. And at the current price, it is still worth more or less as much fiat as MCO cost at the 5 $/€ mark, which was more or less the community standard used for calculating the card prices. By that, I mean that the fiat value of 50/500/5000 MCO (as CRO) is actually not far from the 250/2500/25,000 $/€ that the community commonly used as the standard when calculating the ROI and (under)valuation of MCO.
So CDC is at least trying to give us the option to get (some) our money back, and not at a unfair rate. If you happened to buy MCO at a price higher than this, I can't see how that's CDC's fault, just as I don't see anyone blaming Bitcoin or Altcoins for getting them stuck at the top of the 2017 bubble burst.
I read many posts in this subreddit calling this a "backstab" and a "betrayal" of early investors and of the people who "believed in MCO". Emotionally, I share that sentiment. But after thinking about it for a while, I'd say this was actually very rewarding for early investors and long-term MCO supporters. As CDC clearly states in the swap rules, nobody is going to lose their card tier or MCO stake benefits (at least not yet), and your stake DOES NOT unstake automatically after 180 days. Actually, so far stakes never did unstake automatically; you had to manually unstake yourself.
With this in mind, everyone who already got their cards, or at least staked MCO to reserve one, basically got them 3-5 times cheaper than future users; and IMHO, the $/€ price of cards now feels more fair and sustainable compared to their benefits. So in a sense, everyone who supported and believed in MCO for its utility (i.e. the card and app benefits) has been greatly rewarded with perks that they get to keep, but which are now out of reach for a lot of people. Likewise, the people who believed and invested in CRO (for whatever reason) have also been rewarded, as their CRO tokens now have more utility.
So either the price of CRO crashes down to around 0.05 $/€, or the people who bought MCO/CRO early or cheap are now massively benefited. But then again, so is everyone who bought or mined Bitcoin in its early days, or invested in Bitcoin at crucial points of its history... how is that unfair? Some people bought Ethereum at 1'400 $ on a mix of hopes/promises that it would continue to rise; it didn't. And even today with DeFi and ETH 2.0 ever closer, it is still far from that price.
And I know what some of you are thinking: "The cards aren't available in my country yet, that's why I didn't buy/stake." Well, they weren't available in my country either when I staked 50 MCO. Heck, the cards weren't available in anyone's country when MCO started, but many people still bought it and staked it. That's exactly what "early adopter", "long-term supporter" and "believing in MCO" mean.
On the other hand, the people who invested in MCO as a speculative asset and decided to HODL and hoard MCO, hoping for its price to moon so they could sell at a big profit, had their dreams mercilessly crushed by this swap... and good lord, I feel their pain. But this is also where I'll commit the sin of being judgemental, because IMHO speculating on MCO never made any sense to me; MCO was a utility token, not a value token, so it should not (and could not) ever be worth more than the value of its utility. That's basically how stablecoins and PAXG are able to stay stable: because nobody will pay more/less than the value of the asset/service they represent.
Though now that I'm looking at the new card stake tiers in CRO, I have to give credit to the MCO hodlers I just criticised; maybe you were right all along. Unless the price of CRO crashes or corrects, in which case, I un-rest my case.
One thing I'll agree with everyone on, though, is that I feel CDC just suckerpunched its community. Because even if we have no vote on its decisions (which, again, we aren't necessarily entitled to, since they are a private and centralized business), they should/could have warned that this was in their plans well in advance; if anything, to allow those who wouldn't like it to exit this train calmly.
Also the CRO stake duration reset. The mandatory reset of your CRO stake for taking advantage of the early swap bonus feels like another gut-punch.
-----------------------------------------------------------------------------------------------------------------------------------------------------
Now that we've got the emotional side out of the way, here's my sentiment about how this will affect the overall CDC ecosystem.
One common criticism of the sustainability of MCO was that its supply cap could never allow a large number of cards to be issued, and how could CDC keep paying the cashbacks and rebates? In the opposite corner, one of the major criticisms of the sustainability of CRO was its ridiculously huge supply cap and the inflation caused by the gradual unfreezing and release of more CRO into the system.
But now that MCO and CRO have become one, both issues might just have become more sustainable. Now the huge supply cap of CRO makes more sense, as it allows a much larger number of future users to stake for cards (at higher cost, but still). And because most card cashback comes in small parcels, this large supply also ensures that CDC can keep paying said cashbacks for a long time; especially since it can be semi-renewable through the trading fees we pay in CRO.
Before this, the MCO you got as cashback had no use other than selling it for fiat or speculating on its price. But CRO can be used, at the very least, to get a discount on trading fees. And every time you pay trading fees in CRO or spend CRO on a Syndicate event, some of that CRO goes back to CDC, which they can use to keep paying the cashback/rebates.
And keep in mind, the technicalities of CRO can be changed, as well as the perks and utilities it can be used for. So even if this current model doesn't fix everything (which it probably doesn't), it can still be changed to patch problems or expand its use.
Another obvious, potentially positive outcome is that now CDC only has to focus on one token, which makes it easier to manage and to drive its value. People complained that CDC was neglecting MCO in favour of promoting CRO, but now they can focus on both services (cards/exchange) at the same time. Sure, this might not bring much advantage to the common customer, but it's probably a major resource saver and optimizer at the corporate level; which in the long term ultimately benefits its customers.
Much like Ethereum is undergoing major changes to ensure its scalability, crypto companies themselves also have to change to accommodate the growing number of users, especially as the crypto market and DeFi grow and become more competitive. Business strategies that were once successful have become obsolete, and exchanges that once held near-monopolies have had to adjust to rising competitors. There is no reason why CDC shouldn't keep up with this, or at least try to.
Point is, the financial markets, crypto or otherwise, are not a status quo haven. And when something is wrong, something has to change, even if it costs. The very rise of cryptocurrencies and blockchain, which is why we are here in the first place, is a perfect example of this, as it experiments with and provides alternatives to legacy/traditional products and technologies.
Was this the best solution to its current problems? Is this what will protect us as customers from a potentially unsustainable business model? I have no idea.
This change ripped me too out of my previous, more or less relaxed status quo (the safety of the value of the CRO I bought cheap), along with CRO's late investors, who now probably fear the devaluation of their CRO. To say nothing of the blow this represents for my trust (and, I believe, everyone else's trust) in CDC and its public relations. It's not what CDC did, it's how they did it.
------------------------------------------------------------------------------------------------------------------------------------------------
Whether you actually bothered to read everything I wrote or just skipped to the end (can't blame you), I'm eager to hear your opinions and whatever criticisms of my opinions you may have.
If you just want to vent at me, you are welcome too; now you can raise your pitchforks and torches.
submitted by BoilingGarbage to Crypto_com [link] [comments]

Why Osana takes so long? (Programmer's point of view on current situation)

I decided to write a comment about «Why Osana takes so long?» somewhere and what can be done to shorten this time. It turned into a long essay. Here's TL;DR of it:
The cost of never paying down this technical debt is clear; eventually the cost to deliver functionality will become so slow that it is easy for a well-designed competitive software product to overtake the badly-designed software in terms of features. In my experience, badly designed software can also lead to a more stressed engineering workforce, in turn leading to higher staff churn (which in turn affects costs and productivity when delivering features). Additionally, due to the complexity in a given codebase, the ability to accurately estimate work will also disappear.
Junade Ali, Mastering PHP Design Patterns (2016)
Longer version: I am not sure if people here wanted an explanation from a real developer who works with C and with relatively large projects, but I am going to do it nonetheless. I am not much interested in Yandere Simulator nor in this genre in general, but this particular development has a lot to teach any fellow programmers and software engineers who want to make sure they never end up in Alex's situation, especially considering that he is definitely not the first one to get himself knee-deep in development hell (do you remember Star Citizen?) and he is definitely not the last one.
On the one hand, people see that Alex works incredibly slowly, equivalent of, like, one hour per day, comparing it with, say, Papers, Please, the game that was developed in nine months from start to finish by one guy. On the other hand, Alex himself most likely thinks that he works until complete exhaustion each day. In fact, I highly suspect that both those sentences are correct! Because of the mistakes made during early development stages, which are highly unlikely to be fixed due to the pressure put on the developer right now and due to his overall approach to coding, cost to add any relatively large feature (e.g. Osana) can be pretty much comparable to the cost of creating a fan game from start to finish. Trust me, I've seen his leaked source code (don't tell anybody about that) and I know what I am talking about. The largest problem in Yandere Simulator right now is its super slow development. So, without further ado, let's talk about how «implementing the low hanging fruit» crippled the development and, more importantly, what would have been an ideal course of action from my point of view to get out. I'll try to explain things in the easiest terms possible.
  1. else if's and lack of any sort of refactoring in general
The most «memey» one. I won't talk about the performance though (a switch statement is not better in terms of performance, that is a myth. If the compiler detects some code that can be turned into a jump table, for example, it will do it, no matter if it is a chain of if's or a switch statement. Compilers nowadays are way smarter than one might think). Just take a look here. I know that it's his older JavaScript code, but, believe it or not, this piece is still present in the C# version relatively untouched.
I refactored this code for you using C (mixed with C++, since there's no this pointer in pure C). Note that the else if's are still there; else if's are not the problem in themselves.
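Since the linked snippet isn't reproduced here, here's a minimal illustrative sketch in C of what a flag-based version of such witness logic could look like; the flag names and reactions are hypothetical and not taken from the actual source:

```c
#include <stdio.h>

/* Minimal illustrative sketch of the flag-based approach described above.
 * Names and reactions are made up, not taken from the real game code. */
enum {
    SAW_TRESPASSING = 1 << 0,
    SAW_BLOOD       = 1 << 1,
    SAW_WEAPON      = 1 << 2,
    SAW_MURDER      = 1 << 3,
};

typedef unsigned evidence_t;

/* Checks run from most to least severe, so new combinations
 * (e.g. Trespassing + Blood) resolve without any extra branches. */
static const char *witness_reaction(evidence_t seen)
{
    if (seen & SAW_MURDER)      return "run and call the police";
    if (seen & SAW_WEAPON)      return "keep distance and warn others";
    if (seen & SAW_BLOOD)       return "become suspicious";
    if (seen & SAW_TRESPASSING) return "ask the player to leave";
    return "ignore";
}

int main(void)
{
    evidence_t seen = SAW_TRESPASSING | SAW_BLOOD;
    printf("Reaction: %s\n", witness_reaction(seen));
    return 0;
}
```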
The refactored code is just objectively better for one simple reason: it is shorter while not being obscure, and it can now handle, say, the Trespassing and Blood case without any input from the developer, thanks to the use of flags. Basically, the shorter your code, the more you can see on screen without spreading your attention too thin. As a rule of thumb, the fewer lines there are, the easier it is to work with the code. Just don't overdo it, unless you are going to participate in the International Obfuscated C Code Contest. Let me reiterate:
Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.
Antoine de Saint-Exupéry
This is why refactoring (the activity of rewriting your old code so it does the same thing, but quicker, more generically, in fewer lines, or more simply) is so powerful. In my experience, you can only keep one module/class/whatever in your brain if it does not exceed ~1000 lines, maybe ~1500. Splitting a 17000-line-long class into smaller classes probably won't improve performance at all, but it will make working with parts of that class way easier.
Is it too late now to start refactoring? Of course NO: better late than never.
  2. Comments
If you think that because you wrote this code you'll always easily remember it, I have some bad news for you: you won't. In my experience, one week and that's it. That's why comments are so crucial. It is not necessary to put a ton of comments everywhere, but just a general idea will help you out in the future, even if you think that It Just Works™ and you'll never ever need to fix it. The time spent writing and debugging one line of code almost always exceeds the time spent writing one comment in large-scale projects. Moreover, the best code is the code that is self-evident. In the example above, what the hell does (float) 6 mean? Why not wrap it in a constant with a good, self-descriptive name? Again, it won't affect performance, since the C# compiler is smart enough to silently remove this constant from the real code and place its value into the method invocation directly. Such constants are there for you.
I rewrote my code above a little bit to illustrate this. With those comments, you don't have to remember your code at all, since its functionality is outlined in two tiny lines of comments above it. Moreover, even a person with zero knowledge in programming will figure out the purpose of this code. It took me less than half a minute to write those comments, but it'll probably save me quite a lot of time of figuring out «what was I thinking back then» one day.
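Something in this spirit; the constant's name, value and meaning are invented purely for illustration:

```c
/* Hypothetical illustration of the point above: a magic number replaced by a
 * named constant plus a short two-line comment. The value and name are made up. */

/* A witness reports the player if they stay within this radius (in meters)
 * after being warned once. */
static const float WITNESS_REPORT_RADIUS = 6.0f;

static int should_report(float distance_to_player, int already_warned)
{
    return already_warned && distance_to_player < WITNESS_REPORT_RADIUS;
}
```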
Is it too late now to start adding comments? Again, of course NO. Don't be lazy and redirect all your typing from «debunk» page (which pretty much does the opposite of debunking, but who am I to judge you here?) into some useful comments.
  3. Unit testing
This is often neglected, but consider the following. You wrote some code, you ran your game, you saw a new bug. Was it introduced right now? Is it a problem in your older code which has shown up just because you have never actually used it until now? Where should you search for it? You have no idea, and you have one painful debugging session ahead. Just imagine how easier it would be if you've had some routines which automatically execute after each build and check that environment is still sane and nothing broke on a fundamental level. This is called unit testing, and yes, unit tests won't be able to catch all your bugs, but even getting 20% of bugs identified at the earlier stage is a huge boon to development speed.
Is it too late now to start adding unit tests? Kinda YES and NO at the same time. Unit testing works best if it covers the majority of project's code. On the other side, a journey of a thousand miles begins with a single step. If you decide to start refactoring your code, writing a unit test before refactoring will help you to prove to yourself that you have not broken anything without the need of running the game at all.
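For example, a bare-bones assert-based check of a small pure function (the function and values here are hypothetical) is enough to pin down current behaviour before you touch it:

```c
#include <assert.h>
#include <stdio.h>

/* Tiny assert-based "unit test" sketch: capture the current behaviour of a
 * pure function before refactoring it, then re-run after. Everything here
 * is made up for illustration. */
static int sanity_level_after(int current, int saw_blood, int saw_weapon)
{
    if (saw_weapon) return current - 2;
    if (saw_blood)  return current - 1;
    return current;
}

int main(void)
{
    assert(sanity_level_after(10, 0, 0) == 10); /* nothing seen: unchanged */
    assert(sanity_level_after(10, 1, 0) == 9);  /* blood only */
    assert(sanity_level_after(10, 1, 1) == 8);  /* weapon dominates */
    printf("all checks passed\n");
    return 0;
}
```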
  4. Static code analysis
This is basically self-explanatory. You set this thing up once, and you forget about it. A static code analyzer is another piece of «free real estate» to speed up the development process by finding tiny errors, mostly silly typos (think you're good at spotting them? Well, good luck catching x << 4; in place of x <<= 4; buried deep in C code by eye!). Again, this is not a silver bullet; it is another tool that will help you with debugging a little bit, alongside the debugger, unit tests and other things. You need every little bit of help here.
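For illustration, this is exactly the kind of slip meant above; any decent analyzer (or even gcc with -Wall) flags the first statement as having no effect:

```c
#include <stdio.h>

/* The one-character slip an eye skims past but a tool catches immediately:
 * the first statement computes a value and silently discards it. */
int main(void)
{
    unsigned x = 3;
    x << 4;   /* flagged: result is computed and never used (meant x <<= 4) */
    x <<= 4;  /* the intended shift-and-assign */
    printf("%u\n", x); /* 48 */
    return 0;
}
```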
Is it too late now to hook up static code analyzer? Obviously NO.
  5. Code architecture
Say, you want to build Osana, but then you decided to implement some feature, e.g. Snap Mode. By doing this you have maybe made your game a little bit better, but what you have just essentially done is complicated your life, because now you should also write Osana code for Snap Mode. The way game architecture is done right now, easter eggs code is deeply interleaved with game logic, which leads to code «spaghettifying», which in turn slows down the addition of new features, because one has to consider how this feature would work alongside each and every old feature and easter egg. Even if it is just gazing over one line per easter egg, it adds up to the mess, slowly but surely.
A lot of people mention that the developer should have been doing it in an object-oriented way. However, there is no silver bullet in programming. It does not matter that much whether you are doing it the object-oriented way or the usual procedural way; you can theoretically write, say, AI routines in a functional language (e.g. LISP) or even a logic language if you are brave enough (e.g. Prolog). You can even invent your own tiny programming language! The only thing that matters is code quality and avoiding the so-called shotgun surgery situation, which plagues Yandere Simulator from top to bottom right now. Is there a way of adding a new feature without interfering with your older code (e.g. by creating a child class which encapsulates all the things you need)? Go for it, this feature is basically «free» for you. Otherwise you'd better think twice before doing this, because you are going into «technical debt» territory, borrowing your time from the future by saying «I'll maybe optimize it later» and «a thousand more lines probably won't slow me down in the future that much, right?». Technical debt will incur interest on its own that you'll have to pay. Basically, the entire situation around Osana right now is just a huge tale about how the «interest» incurred by technical debt can control the entire project, like the tail wagging the dog.
I won't elaborate here further, since it'll take me an even larger post to fully describe what's wrong about Yandere Simulator's code architecture.
Is it too late to rebuild code architecture? Sadly, YES, although it should be possible to split Student class into descendants by using hooks for individual students. However, code architecture can be improved by a vast margin if you start removing easter eggs and features like Snap Mode that currently bloat Yandere Simulator. I know it is going to be painful, but it is the only way to improve code quality here and now. This will simplify the code, and this will make it easier for you to add the «real» features, like Osana or whatever you'd like to accomplish. If you'll ever want them back, you can track them down in Git history and re-implement them one by one, hopefully without performing the shotgun surgery this time.
  6. Loading times
Again, I won't be talking about the performance, since you can debug your game on 20 FPS as well as on 60 FPS, but this is a very different story. Yandere Simulator is huge. Once you fixed a bug, you want to test it, right? And your workflow right now probably looks like this:
  1. Fix the code (unavoidable time loss)
  2. Rebuild the project (can take a loooong time)
  3. Load your game (can take a loooong time)
  4. Test it (unavoidable time loss, unless another bug has popped up via unit testing, code analyzer etc.)
And you can fix it. For instance, I know that Yandere Simulator generates all the students' photos during loading. Why should that be done there? Why not either move it to the project build stage by adding a build hook so Unity does it for you during a full project rebuild, or, even better, disable it completely or replace the photos with «PLACEHOLDER» text for debug builds? Each second spent watching the loading screen will be rightfully interpreted as «son is not coding» by the community.
Is it too late to reduce loading times? Hell NO.
  7. Jenkins
Or any other continuous integration tool. «Rebuild the project» can take a long time too, and what can we do about that? Let me give you an idea. Buy a new PC. Get a 32-core Threadripper, 32 GB of the fastest RAM you can afford and a cool motherboard which supports all of that (of course, a Ryzen/i5/Celeron/i386/Raspberry Pi is fine too, but the faster, the better). The rest is not necessary; e.g. a barely functional second-hand video card burned out by bitcoin mining is fine. You set up this second PC in your room. You connect it to your network. You set up a ramdisk to speed things up even more. You properly set up Jenkins on this PC. From now on, Jenkins takes care of the rest: tracking your Git repository, the (re)building process, large and time-consuming unit tests, invoking the static code analyzer, profiling, generating reports and whatever else you can and want to hook up. More importantly, you can fix another bug while Jenkins is rebuilding the project for the previous one, et cetera.
In general, continuous integration is a great technology to quickly track down errors that were introduced in previous versions, attempting to avoid those kinds of bug hunting sessions. I am highly unsure if continuous integration is needed for 10000-20000 source lines long projects, but things can be different as soon as we step into the 100k+ territory, and Yandere Simulator by now has approximately 150k+ source lines of code. I think that probably continuous integration might be well worth it for Yandere Simulator.
Is it too late to add continuous integration? NO, albeit it is going to take some time and skills to set up.
  8. Stop caring about the criticism
Stop comparing Alex to Scott Cawthon. IMO Alex is very similar to the person known as SgtMarkIV, the developer of Brutal Doom, who is also a notorious edgelord and who, for example, once told somebody to kill himself, just like… However, being a horrible person, SgtMarkIV does his job. He simply does not care much about public opinion. That's the difference.
  9. Go outside
Enough said. Your brain works slower if you only think about games and if you can't provide it with enough oxygen supply. I know that this one is probably the hardest to implement, but…
That's all, folks.
Bonus: just think how short this list would have been if someone had simply listened to Mike Zaimont instead of breaking down in tears.
submitted by Dezhitse to Osana [link] [comments]

How EpiK Protocol “Saved the Miners” from Filecoin with the E2P Storage Model?

On October 20, Eric Yao, Head of EpiK China, and Leo, Co-Founder & CTO of EpiK, visited the Deep Chain Online Salon and discussed “How did EpiK save the miners eliminated by Filecoin by launching the E2P storage model?” The following is a transcript of the sharing.
Sharing Session
Eric: Hello, everyone, I’m Eric. I graduated from the School of Information Science at Tsinghua University. My Master’s research was on data storage and big data computing, and I published a number of papers at top industry conferences.
Since 2013, I have invested in Bitcoin, Ethereum, Ripple, Dogecoin, EOS and other well-known blockchain projects, and I have settled into the blockchain space as an early technology-focused investor and industry observer, with 2 years of hands-on blockchain experience. I am also a blockchain community initiator and technology evangelist.
Leo: Hi, I’m Leo, I’m the CTO of EpiK. Before I got involved in founding EpiK, I spent 3 to 4 years working on blockchain, public chain, wallets, browsers, decentralized exchanges, task distribution platforms, smart contracts, etc., and I’ve made some great products. EpiK is an answer to the question we’ve been asking for years about how blockchain should be landed, and we hope that EpiK is fortunate enough to be an answer for you as well.
Q & A
Deep Chain Finance:
First of all, let me ask Eric, on October 15, Filecoin’s main website launched, which aroused everyone’s attention, but at the same time, the calls for fork within Filecoin never stopped. The EpiK protocol is one of them. What I want to know is, what kind of project is EpiK Protocol? For what reason did you choose to fork in the first place? What are the differences between the forked project and Filecoin itself?
Eric:
First of all, let me answer the first question, what kind of project is EpiK Protocol.
With the Fourth Industrial Revolution already upon us, comprehensive intelligence is one of the core goals of this stage, and the key to comprehensive intelligence is how to make machines understand what humans know and learn new knowledge based on what they already know. And the knowledge graph scale is a key step towards full intelligence.
In order to solve the many challenges of building large-scale knowledge graphs, the EpiK Protocol was born. EpiK Protocol is a decentralized, hyper-scale knowledge graph that organizes and incentivizes knowledge through decentralized storage technology, decentralized autonomous organizations, and generalized economic models. Members of the global community will expand the horizons of artificial intelligence into a smarter future by organizing all areas of human knowledge into a knowledge map that will be shared and continuously updated for the eternal knowledge vault of humanity
And then, for what reason was the fork chosen in the first place?
EpiK’s project founders are all senior blockchain industry practitioners and have been closely following the industry development and application scenarios, among which decentralized storage is a very fresh application scenario.
However, during Filecoin’s development, the team found that due to certain design mechanisms and historical reasons, Filecoin had deviated somewhat from the project’s original intention, for example the overly harsh penalty mechanism, which was meant to deter threats but ends up weakening security, and the computing power race, which leads to large miners monopolizing computing power and therefore packaging rights, since they can inflate their computing power by uploading useless data themselves.
These problems will cause the data environment on Filecoin to get worse and worse, leading to a lack of real value in the on-chain data, high data redundancy, and difficulty commercializing the project.
Having noted the above problems, the project team proposes to introduce multiple roles and a decentralized collaboration platform (a DAO) to ensure the high value of the on-chain data through a reasonable economic model and incentive mechanism, and to store that high-value data, a knowledge graph, on the blockchain through decentralized storage. This largely solves both the lack of value in the on-chain data and the computing power monopoly of large miners.
Finally, what differences exist between the forked project and Filecoin itself?
Given the above issues, EpiK’s design is very different from Filecoin’s. First of all, EpiK is more focused in terms of business model: it targets a different market and track from the cloud storage market where Filecoin sits, because decentralized storage has no advantage over professional centralized cloud storage in terms of cost or user experience.
EpiK focuses on building a decentralized knowledge graph, which reduces data redundancy and safeguards the value of data in the distributed storage chain while preventing the knowledge graph from being tampered with by a few people, thus making the commercialization of the entire project reasonable and feasible.
From the perspective of ecosystem building, EpiK treats miners in a much more friendly way and solves Filecoin’s pain points to a large extent. Firstly, it replaces Filecoin’s storage collateral and commitment collateral with a one-time collateral.
Miners participating in EpiK Protocol are only required to pledge 1000 EPK per miner, and only once before mining, not in each sector.
What does 1000 EPK amount to? You only need to participate in pre-mining for about 50 days to earn the tokens used for the pledge. The EPK pre-mining campaign is currently underway and runs from early September to December, with a daily release of 50,000 ERC-20 standard EPK; the pre-mining nodes whose applications are approved divide these tokens according to the day’s mining ratio, and the tokens can be exchanged 1:1 directly once the main network launches. This move will continue to expand the number of miners eligible to participate in EPK mining.
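As a quick back-of-the-envelope check of that “about 50 days” figure (the per-node share below is reverse-engineered from the quoted numbers, not an official parameter):

```c
#include <stdio.h>

/* Sanity check of the "about 50 days" claim: with 50,000 EPK released per
 * day, a node earning roughly 0.04% of the daily release gets ~20 EPK/day,
 * which covers the 1000 EPK pledge in ~50 days. The 0.04% share is an
 * assumption chosen to match the quoted figures. */
int main(void)
{
    const double daily_release = 50000.0;
    const double node_share    = 0.0004;          /* assumed 0.04% share */
    const double pledge        = 1000.0;
    double per_day = daily_release * node_share;  /* = 20 EPK/day */
    printf("days to cover pledge: %.0f\n", pledge / per_day); /* ~50 */
    return 0;
}
```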
Secondly, EpiK has a more lenient penalty mechanism than Filecoin’s consensus, storage and contract penalties. Because data can only be uploaded by field experts (the “Expert to Person” mode) and every piece of data is backed up across miners, one or more miners going offline does not have much impact on the network. A miner who fails to submit the proof of spacetime in time because of being offline only has the effective computing power of that sector forfeited; the pledged coins are not forfeited.
If the miner can re-submit the proof of spacetime within 28 days, he will regain that computing power.
Unlike Filecoin’s 32GB sectors, EpiK’s encapsulated sectors are smaller, only 8M each, which will solve Filecoin’s sector space wastage problem to a great extent, and all miners have the opportunity to complete the fast encapsulation, which is very friendly to miners with small computing power.
The data and quality constraints will also ensure that the effective computing power gap between large and small miners does not grow too wide.
Finally, unlike Filecoin’s P2P data uploading model, EpiK changes the data uploading and maintenance to E2P uploading, that is, field experts upload and ensure the quality and value of the data on the chain, and at the same time introduce the game relationship between data storage roles and data generation roles through a rational economic model to ensure the stability of the whole system and the continuous high-quality output of the data on the chain.
Deep Chain Finance:
Eric, on the eve of Filecoin’s mainnet launch, issues such as Filecoin’s pre-collateral aroused a lot of controversy among miners. In your opinion, what kind of impact will Filecoin have on itself and on the whole distributed storage ecosystem now that it has launched? Do you think the current chaotic FIL prices are reasonable, and what should a normal price for FIL be?
Eric:
The Filecoin mainnet has launched and many potential problems have been exposed, such as the aforementioned high pre-collateral, the storage resource waste and computing power monopoly caused by unreasonable sector encapsulation, and the harsh penalty mechanism. These problems are quite serious and will greatly affect the development of the Filecoin ecosystem.
Here are two examples to illustrate. Take the computing power monopoly of big miners: once big miners have monopolized computing power, a very delicate situation emerges. When a miner stores a file for an ordinary user, there is no way to verify on the chain whether what he stored was uploaded by someone else or by himself, because I can fake another identity and upload data to myself. So when any miner chooses which data to store, he has only one goal: to inflate his computing power, and to inflate it as fast as possible.
In terms of computing power, there is no difference between storing other people’s data and storing my own. And when I store someone else’s data, I don’t know that data; it comes from somewhere in the world, and the bandwidth quality between me and its owner may not be good enough.
The most economical option is to store my own local data, and that results in no one actually storing data for anyone else on the chain. Everyone only stores their own data, because it’s the most economical for them, so the network has essentially no storage utility; no one is providing storage for the mass of retail users.
The harsh penalty mechanism will also severely deplete miners’ profits, because DDoS attacks are a very common technique for attackers, and a big miner can earn a very high profit in a short period of time by attacking others; this is profitable for all big miners.
As things stand now, the vast majority of miners are actually not very well maintained, so they are not well protected even against these low-level DDoS attacks. So the penalty regime is grim for them.
The contradiction between the unreasonable system and the demand will inevitably lead to the evolution of the system in a more reasonable direction, so there will be many forked projects that are more reasonable in terms of mechanism, thus attracting Filecoin miners and a diversion of storage power.
Since every project is in the decentralized storage track, the requirements on miners are similar or even mutually compatible, so miners will tend to move to the forked projects with better economics and business scenarios, which will filter out the projects with real value on the ground.
As for the chaotic FIL price: FIL is a project that has been through several years and carries too many expectations, so it can only be said that the current situation has its reasons for existing. As for a reasonable price for FIL, there is no way to make a prediction, because in the long run one has to consider whether the project’s commercialization lands and the actual value of the on-chain data. In other words, we need to keep observing whether Filecoin becomes a game of computing power or a real value carrier.
Deep Chain Finance:
Leo, we just mentioned that the pre-collateral issue of Filecoin caused dissatisfaction among miners, and after Filecoin’s mainnet launch the second-round space race test coins were directly converted into real coins and the official selling of FIL hit the market, so many miners said they were betrayed. What I want to know is: EpiK’s main motto is “save the miners eliminated by Filecoin”, so how does EpiK deal with the various problems of Filecoin, and how will it achieve this “saving”?
Leo:
Filecoin’s tacit approval of computing power padding amounted to the team openly choosing to abandon small miners. And turning the test coins into real coins also hurt the interests of the loyal big miners in one stroke. We do not know why such basic mistakes were made; we can only regret them.
EpiK didn’t set out to fork Filecoin; rather, because EpiK needs to build a shared knowledge graph ecosystem, it had to integrate decentralized storage, so Filecoin’s most hardcore technologies, the PoRep and PoSt decentralized verification mechanisms, were chosen. In order to ensure the quality of the knowledge graph data, EpiK only allows community-voted field experts to upload data, so EpiK naturally prevents miners from padding computing power, and there is no reason for valueless data to take up such an expensive decentralized storage resource.
With the inability to make up computing power, the difference between big miners and small miners is minimal when the amount of knowledge graph data is small.
We can’t say that we can save the big miners, but we are definitely the optimal choice for the small miners who are currently in the market to be eliminated by Filecoin.
Deep Chain Finance:
Let me ask Eric: According to EpiK protocol, EpiK adopts the E2P model, which allows only experts in the field who are voted to upload their data. This is very different from Filecoin’s P2P model, which allows individuals to upload data as they wish. In your opinion, what are the advantages of the E2P model? If only voted experts can upload data, does that mean that the EpiK protocol is not available to everyone?
Eric:
First, let me explain the advantages of the E2P model over the P2P model.
There are five roles in the DAO ecosystem: miner, coin holder, field expert, bounty hunter and gateway. These five roles allocate the EPKs generated every day when the main network is launched.
The miner owns 75% of the EPKs, the field expert owns 9% of the EPKs, and the voting user shares 1% of the EPKs.
The other 15% of the EPK will fluctuate based on the daily traffic to the network, and the 15% is partly a game between the miner and the field expert.
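As a rough sketch of that split for an arbitrary daily issuance (the issuance figure itself isn't stated here, so it is left as a parameter, and the floating 15% is kept as a single bucket since its division depends on the miner/expert traffic game):

```c
/* Sketch of the daily EPK split described above for an arbitrary daily
 * issuance. How the floating 15% divides between miners and field experts
 * depends on network traffic, so it stays as one bucket here. */
typedef struct {
    double miners;    /* fixed 75% */
    double experts;   /* fixed 9% */
    double voters;    /* fixed 1% */
    double floating;  /* 15%, split by the miner/expert traffic game */
} DailySplit;

static DailySplit split_daily_issuance(double daily_epk)
{
    DailySplit s = {
        .miners   = daily_epk * 0.75,
        .experts  = daily_epk * 0.09,
        .voters   = daily_epk * 0.01,
        .floating = daily_epk * 0.15,
    };
    return s;
}
```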
Let me first describe the relationship between these two roles.
The first group of field experts are selected by the Foundation, who cover different areas of knowledge (a wide range of knowledge here, including not only serious subjects, but also home, food, travel, etc.) This group of field experts can recommend the next group of field experts, and the recommended experts only need to get 100,000 EPK votes to become field experts.
The field expert’s role is to submit high-quality data to the miner, who is responsible for encapsulating this data into blocks.
Network activity is measured by the share of EPK pledged across the whole network for daily traffic (1 EPK buys 10 MB/day of access). A higher share indicates higher data demand, which requires the miners to improve bandwidth quality; if data demand falls, the field experts are required to provide higher-quality data.
It is like a library: when there are more visitors, more seats are needed, i.e. the miners are paid to upgrade the bandwidth; when there are fewer visitors, more money goes into buying better books to attract visitors, i.e. the bounty hunters and field experts are paid to produce more high-quality knowledge graph data. The game between miners and field experts is the most important game in the ecosystem, unlike the game between the officials and the big miners in the Filecoin ecosystem.
This game between data producers and data storers, together with a more rational economic model, means the E2P model will inevitably generate on-chain data of much higher quality than the P2P model, and the bandwidth for accessing that data will be better too, resulting in greater business value and better real-world use cases.
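To make the allocation split and the traffic-pledge rule described above concrete, here is a minimal sketch in Python; the daily issuance figure and the pledged amount are hypothetical placeholders, not numbers from the interview — only the 75/9/1/15 percentages and the "1 EPK = 10 MB/day" rule come from the text.

```python
# Hypothetical illustration of the daily EPK split and the traffic rule
# described above. DAILY_ISSUANCE and total_pledged are made-up placeholders;
# only the percentages (75/9/1/15) and "1 EPK = 10 MB/day" come from the text.

DAILY_ISSUANCE = 100_000  # hypothetical EPK minted per day

shares = {
    "miners": 0.75,         # 75% to miners
    "field_experts": 0.09,  # 9% to field experts
    "voters": 0.01,         # 1% shared by voting users
    "floating": 0.15,       # 15% fluctuates with daily traffic demand
}

allocation = {role: DAILY_ISSUANCE * pct for role, pct in shares.items()}
print(allocation)
# {'miners': 75000.0, 'field_experts': 9000.0, 'voters': 1000.0, 'floating': 15000.0}

# Traffic demand: 1 pledged EPK buys 10 MB/day of access bandwidth,
# so the total pledged EPK translates directly into purchased daily traffic.
total_pledged = 2_000_000                          # hypothetical EPK pledged for traffic
daily_traffic_tb = total_pledged * 10 / 1_000_000  # MB -> TB (decimal)
print(f"purchased access: {daily_traffic_tb:.1f} TB/day")
```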
I will then answer the question of whether this means that the EpiK protocol will not be universally accessible to all.
The E2P model only constrains the quality of the data that is generated and stored, not who can play a role in the ecosystem. On the contrary, with the introduction of the DAO model, EpiK offers a variety of roles that are open to ordinary people (for example, anyone competent at the tasks can become a bounty hunter), giving everyone a sensible way to participate in the system.
For example, a miner with computing power can provide storage, a person with a certain domain knowledge can apply to become an expert (this includes history, technology, travel, comics, food, etc.), and a person willing to mark and correct data can become a bounty hunter.
The presence of various efficient support tools from the project owner will lower the barriers to entry for various roles, thus allowing different people to do their part in the system and together contribute to the ongoing generation of a high-quality decentralized knowledge graph.
Deep Chain Finance:
Leo, some time ago EpiK released a white paper and an economic model whitepaper, explaining the EpiK concept from the perspectives of technology and economics respectively. What I would like to ask is: what are the shortcomings of current distributed storage projects, and how does the EpiK protocol improve on them?
Leo:
Distributed storage is easily confused with systems like Ali's OceanDB, but in the blockchain field we should focus on decentralized storage first.
There is a big problem with the decentralized storage on the market today, a "why don't they eat meat porridge" problem — the Chinese equivalent of "let them eat cake".
How should we understand that? Is decentralized storage cheaper than centralized storage because of its technical principles? It is not; if it were, centralized storage would long since have been unable to compete.
What incentive does the average user have to spend more money on decentralized storage to store data?
Is it safer?
Data on decentralized storage, where miners can shut down at any time, is by no means safer than keeping a copy each with a provider like Alibaba and with Amazon.
More private?
There is no difference between storing encrypted data on decentralized storage and storing encrypted data on Amazon.
Faster?
The bandwidth available to decentralized storage nodes simply doesn't compare to the fibre in a centralized server room. This is the root problem of the business model: no one is using it and no one is buying it, so what is the big vision for?
The goal of EpiK is to guide all community participants in jointly building and sharing domain knowledge graph data, which is the best way for machines to understand human knowledge: the more knowledge graph data there is, the more a machine knows, and the more intelligent it becomes, exponentially. In other words, EpiK uses decentralized storage technology to capture the value of exponentially growing data at linearly growing hardware cost, and that is where the buy-in for EPK comes from.
Organized data is worth a lot more than organized hard drives, and there is a demand for EPK when robots have the need for intelligence.
Deep Chain Finance:
Let me ask Leo: roughly how many forked projects does Filecoin have so far? Do you think there will be more or fewer waves of forks now that the mainnet has launched? And have the requirements of miners at large changed when it comes to participation?
Leo:
We don't have exact statistics, but now that the mainnet has launched we expect the number of forked projects to increase; there are so many stranded miners in the market who need to be organized efficiently.
However, most of the forked projects we currently see simply tweak the parameters of Filecoin's economic model, which is not enough. That level of modification cannot change the status quo of miners faking computing power; all it does is make some big miners feel more comfortable mining, and it will not help decentralized storage find real-world adoption.
We need more sensible real-world use cases so that idle mining resources can be turned into effective productivity, instead of pitching a 100x coin and riding one wave of FOMO after another.
Deep Chain Finance:
How far along is the EpiK Protocol project, Eric? What other big moves are coming in the near future?
Eric:
The development of the EpiK Protocol is divided into 5 major phases.
Phase I: testnet "Obelisk".
Phase II: mainnet 1.0 "Rosetta".
Phase III: mainnet 2.0 "Hammurabi".
Phase IV: enrich the knowledge graph toolkit.
Phase V: enrich the knowledge graph application ecosystem.
We are currently in the first phase, the "Obelisk" testnet: anyone can sign up to take part in the pre-mining test and earn ERC-20 EPK tokens, which will be exchangeable one-to-one after the mainnet launches.
We have recently listed ERC-20 EPK on Uniswap; you can buy and sell it freely there, or download our EpiK mobile wallet.
In addition, we will soon launch the EpiK Bounty platform, and we welcome all community members to do tasks together to build the EpiK community. At the same time, we are also pushing forward listings on centralized exchanges.
Users’ Questions
User 1:
Some KOLs have said that Filecoin has already used up several years' worth of its future value and will therefore plunge. What do you think?
Eric:
First of all, judgments about the market correspond to cycles. Before deciding you are bearish on FIL, you first have to decide whether you are bearish on the project's economic model or bearish on the distributed storage track as a whole.
We are very confident in the distributed storage track itself. It will certainly go through phases of growth and decline, and that is exactly when the better projects get chosen.
Since the existing group of miners and the computing power already deployed are fixed, and since EpiK miners and FIL miners are compatible, miners can at any time switch to whichever project they consider more promising and more economical.
As for the claim that Filecoin has already consumed several years' worth of its future value and will therefore plunge: a plunge is not something we predict. In this industry you have to keep learning, iterating and judging value. Market sentiment up or down is one factor, but there are other, more important ones, such as the big washout in March this year; events like that can only be said to slow down the development of the FIL community. Prices are indeed unpredictable.
User2:
Actually, in the end, if there are no applications and no one really uploads data, the market value will drop. So what are EpiK's real-world applications?
Leo: The best and most direct application of EpiK’s knowledge graph is the question and answer system, which can be an intelligent legal advisor, an intelligent medical advisor, an intelligent chef, an intelligent tour guide, an intelligent game strategy, and so on.
submitted by EpiK-Protocol to u/EpiK-Protocol [link] [comments]

Bitcoin endgame

Hi everybody, I am still getting started in the world of cryptocurrencies, focusing on understanding how Bitcoin works. I chose Bitcoin because it is the most established cryptocurrency and the one with the highest chance of becoming a full-fledged worldwide currency used by everyone (at least that's what I think at the moment, I might be wrong here). I know that the practical limit for Bitcoin is 21 million coins and that each bitcoin can be divided into 100 million satoshis. My question is: assuming Bitcoin takes over as the single global currency and everyone is using it, isn't the total amount of satoshis too little? I mean, if you split the total supply among everyone in the world, each person receives a relatively small amount of bitcoin (from my rough estimate, the equivalent of about 30€).
Thanks in advance for dedicating the time to this weird question :)
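For what it's worth, here is a rough back-of-the-envelope check of that per-person estimate; the world population and the euro price of bitcoin below are assumptions, not figures from the post.

```python
# Back-of-the-envelope check of the "about 30€ per person" estimate.
# Population and price are rough assumptions.

TOTAL_BTC = 21_000_000
SATS_PER_BTC = 100_000_000
WORLD_POPULATION = 7.8e9   # assumed, roughly 2020
BTC_PRICE_EUR = 10_000     # assumed price at the time of writing

total_sats = TOTAL_BTC * SATS_PER_BTC            # 2.1e15 satoshis in total
sats_per_person = total_sats / WORLD_POPULATION  # ~269,000 satoshis each
eur_per_person = sats_per_person / SATS_PER_BTC * BTC_PRICE_EUR

print(f"{sats_per_person:,.0f} sats per person ≈ {eur_per_person:.0f}€")
# ~269,231 sats per person ≈ 27€
```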
EDIT:
Thank you all for the answers. As has been said, I agree that the main obstacle to Bitcoin becoming the single currency on a global scale is politics, because no one, in this case governments, would like to lose control of their money.
With an increase in market cap and the possibility of subdividing satoshis (a part I was unaware of), Bitcoin could be used as the global currency: it would have enough "value" to represent the global economy while being divisible enough to price cheap things like a bottle of water.
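To make the divisibility point concrete, here is a quick sketch of what the smallest on-chain unit would be worth at a few arbitrary example prices (the prices are illustrative assumptions, not predictions):

```python
# What is the smallest unit (1 satoshi) worth at different bitcoin prices?
# The prices below are arbitrary examples, not predictions.

SATS_PER_BTC = 100_000_000

for btc_price_eur in (10_000, 100_000, 1_000_000):
    sat_value = btc_price_eur / SATS_PER_BTC
    print(f"at {btc_price_eur:>9,}€/BTC, 1 satoshi = {sat_value:.6f}€")

# Even at 1,000,000€ per BTC, one satoshi is worth 0.01€ — still granular
# enough to price a bottle of water, and sub-satoshi units could be added
# at the protocol or second-layer level if ever needed.
```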
The main technical issue that I see at the moment is the difference to fiat in how bitcoin is stored (hardware wallets) and transferred between two entities (addresses and private keys). For me it is something that I am starting to understand but I think it would be close to impossible for the majority of people that are older / not so tech oriented. I haven’t yet bought bitcoin, just got a bit of exposure to it using Revolut and decided to explore it on a deeper level.
One other thing that was mentioned is that subdividing satoshis is similar to "printing" money and would lead to inflation. I understand why this is being said, because creating money or dividing the current supply into smaller units can be seen as having the same overall effect. I think the key difference with the division of satoshis is that it is not controlled by a central authority. For example, if a new base unit corresponding to 1/100 of a satoshi were created, everyone would be affected equally. When money is printed by a central bank or government, they increase their own wealth by making everyone else poorer, since they increase the share of the money supply they hold (note that I am not an economist and this explanation is probably flawed).
submitted by AlexDRibeiro to BitcoinBeginners [link] [comments]

(20M) living in Spain and looking to get started in the stock market and other investments. Any advice is welcome

I left my job because I was tired of it; it was affecting my physical and mental health. I currently live with my parents, so I have no monthly expenses and could perfectly well live with 0€ in my bank account. I am currently studying marketing in college and I have a higher education degree in marketing and advertising. I have 3.000€ in my bank account and no debts. I would like some advice on how to get started in the stock market and other kinds of investments, such as cryptocurrency or anything else that could give me some income, as I don't have time to work and it's also hard to find a job right now in Spain.
I can take high risks but I'd prefer not to. I would like to make 0-300€ a month if possible; I hope that is a realistic objective.
The only investment I have in mind right now is putting 1.000€ into Bitcoin, as it keeps increasing in value and is starting to be accepted by many websites, banks and services such as PayPal, so the more it is used worldwide, the more its value increases. I also know the supply of BTC is limited, so when all of them are mined there won't be any way of "printing more", and therefore no devaluation/inflation is possible.
What's holding me back from making this investment is that it would be a very long-term investment, and that Bitcoin is backed by nothing but the faith of the people who buy and sell it. Also its volatility.
What do you guys recommend?
submitted by danielrp00 to eupersonalfinance [link] [comments]

PT Super Public Chain has the potential to outperform all mainstream public chains

Public chains have become a widely discussed topic. It used to be all about who had the better headlines, and everyone was talking about Blockchain 3.0. The usual method was to pick one prominent indicator, compare it against a mainstream public chain on the market, and then arrive at a predetermined conclusion. Few articles objectively and comprehensively compare the current mainstream public chains and give the public an intuitive, credible conclusion. Today we are going to break this bad industry habit and make a side-by-side comparison of the current mainstream public chains, to show intuitively and objectively how they differ.

Contestants:
First generation public chain: BTC (father of blockchain)
Second generation public chain representatives: ETH, EOS
Third generation public chain representatives: polkaDOT, VDS, PT public chain
Criteria for classifying the generations:

  • First generation public chain: mainly refers to the move of blockchain from theory to implementation; Bitcoin is recognized as the representative of the first generation.
  • Second generation public chain: the main purpose is to explore the possibilities of blockchain applications, with ETH as the representative. Although EOS claims to be a third-generation public chain, it really belongs to the enhanced version of the second generation.
  • Third generation public chain: built on top of the second generation. Each has generally found its own niche and brings more valuable technology, such as VDS's resonance mechanism, polkaDOT's cross-chain innovation, or the PT public chain's full-chain compatibility mode and ultra-high throughput.
Four dimensions for comparison:

  • Public chain consensus: the core indicator of public chain innovation, which directly affects the performance and security of the public chain, with a top score of five stars.
  • Usage scenarios of the public chain: this mainly reflects the commercial value of the public chain and is an important basis for measuring its commercial prospects, with a top score of five stars.
  • TPS of the public chain: represents the maximum potential upper limit of the public chain, with a top score of five stars.
  • Influence / achievements of the public chain: represents the public chain's contribution to the blockchain industry, with a top score of five stars.
These four dimensions mainly consider the practicability of the public chain and focus on its commercial value, as I believe that productivity is the only standard by which to measure technology.

BTC (first generation public chain)
  • Consensus: POW (proof of work), the consensus with the highest degree of security and decentralization so far. Its disadvantage is lower efficiency, but as the pioneer of POW it earns a strong 4 stars.
  • Usage scenario: digital currency (payments, transfers, asset management). Although BTC is currently the most widely accepted digital currency, its purpose is narrow, so 2.5 stars.
  • TPS: it can only process 7 transactions per second, which is the main factor restricting BTC's adoption at present. This was a technology compromise made in the initial start-up stage, so only 1.5 stars.
  • Influence: the father of blockchain and the founder of digital currency, it has to get the top score of 5 stars.
To sum up, the average score of the first-generation public chain BTC is 3.25 stars.

ETH (second generation public chain)
  • Consensus: POW (proof of work), the same consensus mechanism as Bitcoin, with essentially the same advantages and disadvantages. The difference is that ETH added a mining-machine-resistant algorithm, which makes computing power more decentralized; in addition, a DPOS-style witness mechanism is introduced in the ETH 2.0 era, so 4 stars.
  • Usage scenario: in terms of applications, ETH is unrivalled. It has the largest user group and developer team in the industry and has produced popular, even quasi-killer, applications such as CryptoKitties, FOMO3D and DeFi. It is the king of blockchain applications, so a full score of 5 stars.
  • TPS: before the upgrade to the 2.0 network, ETH can only handle about 30 transactions per second, which is weak, so only 1.5 stars.
  • Influence: the representative of the second-generation blockchain, so 4 stars.
To sum up, the average score of the second-generation public chain ETH is 3.625 stars.

EOS (second generation public chain)
  • Consensus: DPOS is a new consensus created as an alternative to POW and avoids the shortcomings of the POW mechanism, but its security has not been fully recognized by the community. Coupled with the existence of a centralized "referee mechanism", DPOS on the EOS chain has always received mixed reviews in the industry, so at this stage it only deserves 3 stars.
  • Usage scenario: thanks to the improved consensus mechanism, EOS is in principle suitable for large-scale applications, and it has had popular applications such as Pixel Wars. However, due to the high rental cost of CPU resources, developers are drifting away from the EOS ecosystem, and it has been a long time since any popular new application appeared, so only 2.5 stars.
  • TPS: EOS claimed million-level concurrency at the start of development, but the actual tested throughput is about 3,800 transactions per second at the moment. Compared with the first two public chains this is a major breakthrough, so 5 stars.
  • Influence: there was a massive wave of interest at launch, but with no popular applications the ecosystem has gradually withered, so influence gets only 2 stars.
To sum up, the average score of the second-generation public chain EOS is 3.125 stars.

polkaDOT (third generation public chain)
  • Consensus: NPOS is an updated consensus based on an improved DPOS. The double-confirmation mechanism makes it harder for nodes to be corrupted, but at a higher cost, so taking the practical performance of the public chain into account, 4 stars.
  • Usage scenarios: polkaDOT provides a cross-chain relay chain model and positions itself as the highway connecting public chains. At present there is still a lack of real demand in practical scenarios, and polkaDOT has so far remained in a tepid state, so 3.5 stars.
  • TPS: about 1,000 transactions per second on chain; taking safety and efficiency into account, this is a fairly ideal performance, so 4 stars.
  • Influence: its influence is limited to a small portion of the technology exploration community, so 2.5 stars.
To sum up, the average score of the third-generation public chain polkaDOT is 3.5 stars.
VDS (third generation public chain)
  • Consensus: due to the lack of strong computing power behind it, the safety and performance of the public chain have essentially not been proven, so only 1.5 stars.
  • Usage scenario: it has its own resonance mechanism, and it is no exaggeration to say that VDS was the most popular public chain of 2019, gaining explosive popularity in the industry almost immediately. Credit has to be given for that kind of momentum, so 4.5 stars.
  • TPS: the official figure is 60,000 transactions per second, but there is no way to verify it, so only 1 star.
  • Influence: the once explosive project is now a thing of the past; all of its ecological hot spots have been extinguished, so only 1 star.
To sum up, the average score of the third-generation public chain VDS is 1.875 stars.

PT Public Chain (third generation public chain)
  • Consensus: DPOS+SPOS double consensus. This applies the latest blockchain research results and effectively balances the competing demands of security, efficiency and decentralization. It may become mainstream in the future, so a high score of 4.5 stars.
  • Usage scenario: built-in cross-chain support, quantum-computing resistance, and the first multi-currency aggregate mining mode. At present the PT public chain is the only fair-launch chain with zero pre-mining, zero reservations and zero handling charges, and it has long-term development potential. However, the PT public chain has only just gone online and its ecosystem is still far from complete, so only 4 stars.
  • TPS: in the normal state of the main chain, a processing speed of 4,000 transactions per second is excellent, and the PT public chain also has a hidden power-up mode: once the sharding mechanism is enabled, speeds of up to 100,000 transactions per second can be reached, which is quite remarkable. Based on the normal state, it can only be given 3.5 stars for now.
  • Influence: as a public chain, PT has drawn on a lot of new technology research and built considerable innovation into its operations. It has recently become popular in Europe and the United States, but at present, given its budding state, it can only score 2 stars for the time being.
To sum up, the average score of the third-generation public chain PT is 3.5 stars.

In summary, the scores of the three generations of public chains are beyond my original expectations. The second-generation public chains are still the preferred platforms for mainstream applications, with mature technology, a friendly development environment and low user-education costs as their key advantages. The third-generation public chains, as latecomers, generally score lower: their technical goals are very focused, so their profiles are lopsided, with some dimensions coming close to a full score while the rest score relatively low.

I am very optimistic about the PT public chain. As a latecomer, it has the first decentralized DPOS+SPOS consensus mechanism in the blockchain space, offering high security, strong privacy, high efficiency and high scalability, and it supports compatibility and cross-chain technologies, which makes multi-technology development easier. It also innovates on the burn mechanism of aggregate mining, effectively improving on the shortcomings of the traditional mining allocation mechanism, eliminating speculative players and increasing participation in consensus innovation. However, because it is a latecomer, its ecosystem is still in its infancy and there has not been enough time for all of its innovative mechanisms to be tested by the market, so for now I can unfortunately only give it a low overall score.

This score can only be taken as a reference in the current environment. Over time, public chain ecosystems go through ups and downs, user migration, sudden booms and technology iterations. I still believe that a public chain with technical advantages and model innovations such as PT can stand out in the market, and time will be the best witness. As the PT white paper says, make the right choice and you will slowly get rich together.
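As a sanity check on the arithmetic, this small sketch recomputes each chain's average from the four dimension scores quoted above (the scores are copied from the review; nothing else is assumed). Most of the stated averages check out exactly; VDS works out to 2.0 rather than the 1.875 the review states.

```python
# Recompute the per-chain averages from the four dimension scores above:
# (consensus, usage scenario, TPS, influence)
scores = {
    "BTC":      (4.0, 2.5, 1.5, 5.0),   # stated average: 3.25
    "ETH":      (4.0, 5.0, 1.5, 4.0),   # stated average: 3.625
    "EOS":      (3.0, 2.5, 5.0, 2.0),   # stated average: 3.125
    "polkaDOT": (4.0, 3.5, 4.0, 2.5),   # stated average: 3.5
    "VDS":      (1.5, 4.5, 1.0, 1.0),   # stated average: 1.875 (these scores actually average 2.0)
    "PT":       (4.5, 4.0, 3.5, 2.0),   # stated average: 3.5
}

for chain, dims in scores.items():
    print(f"{chain:8s} average = {sum(dims) / len(dims):.3f} stars")
```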
submitted by According_Ticket7936 to u/According_Ticket7936 [link] [comments]


