ChainLink launches Mainnet to get data in and out of Ethereum smart contracts

Original article: www.zdnet.com
Blockchain may be one of the most promising technologies today, but that may just as well be the reason why there’s also a lot of FUD around it. Speculation and crypto-winter aside, however, there’s a number of technology issues to address before blockchains can get real, and data access is prominent among them.
In a nutshell, blockchains are not very efficient as a data storage and retrieval mechanism. This is why people have been experimenting with various approaches to use blockchains as a database, including altering its structure.
Regardless of how successful these turn out to be, however, one thing is certain: Most of the world’s data today does not live on a blockchain. The vast majority of application data live in some database, and some of that data may be accessed via APIs.
How, and why, would the world of databases and APIs talk to the world of blockchain? Enter ChainLink.
Smart contracts and the connectivity problem
You may have heard about smart contracts. You can think of smart contracts as programs on the Ethereum blockchain that execute exactly as their creators set them up to. Smart contracts enhance Ethereum with the ability to execute tamper-proof code, in addition to storing tamper-proof data, turning it into a "world computer."
Together, smart contracts and data form the building blocks for decentralized applications (Dapps) and even whole decentralized autonomous organizations (DAOs). There is a programming language (Solidity) used to develop smart contracts, as well as a development framework (Truffle) that can be used to build smart contract applications.
Despite the fact that this is still not a 100% mature stack, people are using it to develop Dapps and DAOs. Smart contracts can interact with each other, and they can also store and retrieve data on the blockchain. But what happens when they need to interact with the outside world, and retrieve (or store) data from/to databases or APIs?

The Smart Contract Connectivity Problem, as ChainLink defined it, is the inability of a smart contract to interact with any external data feed or other resource that is run outside the node network in which the smart contract itself is executed.
This lack of external connectivity is inherent to all smart contract networks, due to the method by which consensus is reached around blockchain transactions, and will therefore be an ongoing problem for all smart contract networks.
ChainLink, co-founded by CEO Sergey Nazarov and CTO Steve Ellis, aims to solve this problem by developing a so-called oracle, officially launching today. ZDNet connected with the ChainLink team to discuss what this is all about.
ChainLink, the blockchain oracle
An oracle is a gateway between a blockchain and the real world. Oracles can bring off-chain data to smart contracts. The problem with this, of course, is that oracles introduce the need for centralization and trust in the decentralized, trust-less world of blockchains.
ChainLink’s whitepaper, published in 2017, tries to address this on the technical level. Part of ChainLink’s implementation runs on-chain and part off-chain. There are provisions for Service Level Agreements (SLAs), mechanisms for data source selection, result aggregation, and reporting.
There is an API data providers can use to feed their data into ChainLink’s oracle. There are also decentralization approaches and security services outlined, to ensure that ChainLink is robust and secure. One of the things we inquired about was how close today’s launch is to the vision outlined in the ChainLink whitepaper [PDF].
The smart contract connectivity problem: how do smart contracts interoperate with data and APIs beyond the blockchain? Image: ChainLink
The ChainLink team noted that the initial launch is focused on allowing smart contracts to retrieve external data from ChainLink nodes based on the number of individual requests they create. While this is an essential first step, it does not fully implement all of the features discussed in the white paper. ChainLink believes that’s a process that can and should be gradually upgraded as development progresses.
In order to assist smart contract creators today, they went on to add, ChainLink provides documentation and contract examples on how to create requests to multiple oracles and aggregate responses. The Service Agreement Protocol, currently in development, will allow a requester to define parameters for their requests in a setup step, such that a single request can receive responses from multiple oracles.
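The request-to-multiple-oracles pattern described above can be sketched in a few lines of Python. This is an illustrative simulation, not ChainLink's actual contract code: the median aggregator and the minimum-response threshold are assumptions, but they capture why querying several nodes makes the input harder to manipulate.

```python
from statistics import median

def aggregate_responses(responses, min_responses=3):
    """Aggregate reports from multiple oracle nodes into one answer.

    Taking the median (rather than the mean) limits the influence any
    single misbehaving or offline node can have on the final value.
    """
    valid = [r for r in responses if r is not None]
    if len(valid) < min_responses:
        raise ValueError("not enough oracle responses to aggregate")
    return median(valid)

# Three nodes roughly agree, one reports an outlier: the median ignores it.
reports = [101.2, 100.8, 101.0, 9999.0]
print(aggregate_responses(reports))  # -> 101.1
```

A requester contract would apply the same idea on-chain: collect the individual responses and only accept an aggregate once enough nodes have answered.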
In other words, there is a certain degree of technical forethought that has been built into this, although it’s not fully implemented yet. Part of it is there to ensure the oracle is resilient (i.e. it does not crash under heavy load), and part of it to ensure it’s decentralized (i.e. there’s no single point of failure/arbiter of the truth).
Building an ecosystem
ChainLink is launching with three endorsed oracles, including its own. The other teams are Fiews and LinkPool. These teams have been running a ChainLink node on the Ethereum test networks for around a year, and have assisted with the development of the ChainLink node. ChainLink noted they will also have an on-boarding process for endorsed ChainLink nodes to be listed in official documentation.
Other third parties are able to run ChainLink nodes themselves, as ChainLink code is open source. Third parties may use other listing services (currently in development) in order to receive requests from smart contracts.
Any service provider can use ChainLink oracles for their smart contracts. If someone wants to use their own data for their smart contracts, they are free to connect to their own data source. Furthermore, the ChainLink team added, this depends on your perspective:
ChainLink is not just providing the infrastructure to help the development of smart contracts; it is also building an ecosystem around it. Image: ChainLink
"As a data provider, how do I sell my data to smart contracts? The answer is to create an external adapter for my API, run a ChainLink node, and allow smart contracts to create requests to my oracle. As a general node operator, how do I sell data for X API? They would either need to create an external adapter themselves, which may not be viable if they’re not a developer (which is not a requirement), or they can find an open source implementation of an external adapter for the API they’re wanting to provide. We’ve built the ChainLink node to be modular by-design, so external adapters can easily be added by node operators to extend the functionality of their node without needing to know how to write programs."
The ChainLink ecosystem, today and tomorrow
Part of the value ChainLink brings is by providing the infrastructure for anyone to run an oracle, and part of it comes from its own oracle and ecosystem. There have been various names flying around, including a proof of concept project with SWIFT, and alleged "white label" partners such as Salesforce and Microsoft Azure.
The SWIFT proof of concept pulled interest rates from five banks (Barclays, BNP Paribas, Fidelity, Societe Generale, and Santander) and fed the data into a smart contract, which was used to make a payment that translated into a SWIFT payment message.
ChainLink clarified that three types of projects make up the ecosystem: data providers, platforms/blockchains, and projects that use ChainLink oracles. Although ChainLink refrained from pointing to a comprehensive list, they pointed to a Decrypt article which mentions many collaborators and projects. There is a lot of speculation in the industry, they added, and they only confirm partnerships once they are official.
ChainLink provides more than the technical infrastructure here — they also provide an instance of this infrastructure, with vetted data providers onboarded. ChainLink emphasized that they work with top data partners for officially created adapters such as crypto price data, supply chain, etc.
Essentially, there are two layers of selection there: One on the oracle network, and one within each oracle. Users can choose which oracle(s) to use in the oracle network, and oracle nodes can choose which external services to connect to.
This also poses some interesting technical challenges. Essentially, oracles will act as data hubs, with data flowing in and out of them. How will the different data providers and data streams be cataloged, integrated, and managed? And what about issues related to data freshness, correctness, and performance?
Data selection and schema matching
ChainLink currently operates with a schema system based on JSON Schema, to specify what inputs each adapter needs and how they should be formatted. Similarly, adapters specify an output schema to describe the format of each subtask’s output.
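To make the schema idea concrete, here is a minimal sketch in Python. The `REQUEST_SCHEMA` for a hypothetical price-feed adapter and the hand-rolled `validate` function are both invented for illustration; they mimic only the small subset of JSON Schema (`type`, `required`, `properties`) needed here, not ChainLink's actual adapter schemas.

```python
# Hypothetical input schema for a price-feed adapter, in the spirit of
# JSON Schema: each parameter declares the type the adapter expects.
REQUEST_SCHEMA = {
    "type": "object",
    "required": ["coin", "market"],
    "properties": {
        "coin": {"type": "string"},
        "market": {"type": "string"},
    },
}

def validate(request, schema):
    """Minimal validator covering just the subset of JSON Schema used above."""
    if schema["type"] == "object":
        if not isinstance(request, dict):
            return False
        # Every required key must be present.
        if any(key not in request for key in schema.get("required", [])):
            return False
        # Present keys must match their declared types.
        type_map = {"string": str, "number": (int, float)}
        for key, sub in schema.get("properties", {}).items():
            if key in request and not isinstance(request[key], type_map[sub["type"]]):
                return False
    return True

print(validate({"coin": "ETH", "market": "USD"}, REQUEST_SCHEMA))  # True
print(validate({"coin": "ETH"}, REQUEST_SCHEMA))                   # False (missing "market")
```

In practice a node would run a check like this before dispatching a request to an adapter, rejecting malformed inputs early.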
Schema management at scale, with data coming from various domains and sources, is a well-researched and documented topic, but that does not make it easy to deal with in practice – especially when using JSON Schema, which is not the most advanced solution when it comes to schema management.
ChainLink has been used in a proof of concept to clear payments with SWIFT and global banks. Image: ChainLink
So what happens when there is no sufficient metadata about the data flowing through ChainLink? Not to mention, even sufficient metadata can be erroneous or misleading. What happens if I connect a data provider and claim it’s about topic A, but others say it is really about topic B, or C, or D and E? ChainLink says this is where decentralization plays a key role in the oracle problem:
"Just like how smart contracts are secure because they’re run on multiple machines (blockchain nodes), you can secure the inputs to your smart contracts by having that input retrieved by multiple ChainLink nodes. So if you’re a requester, and you want data from a particular API DPA, you define how many ChainLink nodes you want to retrieve that data. To further decentralize your inputs, and if there are additional data providers with the same topic of data, you could have additional ChainLink nodes retrieve from another API DPB to assist with validation."
However, we would argue that while that does indeed address the topic of data source selection, it does not address that of schema matching: The terms used to describe what DPA and DPB is about could be different, and yet their data could be about the same thing. Based on JSON Schema, without a mechanism to align the metadata in place, nobody would ever know.
Data flow
From a data architecture perspective, ChainLink looks like a data hub through which data will flow transiently. However, a published list of use cases mentions interacting with databases and data in the cloud.
We wondered whether there are implementations of such use cases to show for today. Plus, if this takes off, the amount of data flowing through ChainLink will be considerable. Would ChainLink consider storing any of that data in the oracle, for example for caching?
ChainLink’s view is that they like to think of it as an on-chain protocol that allows smart contracts and node operators to work with one another in a trust-minimized way:
The ChainLink oracle acts as a data hub, enabling data source selection on 2 levels: oracle and external data sources. Image: ChainLink
"This means that any endpoint that a node operator can access can be used by a smart contract through our protocol. We have a number of working implementations that give smart contracts the ability to retrieve data from authenticated data sources. Storing or caching data within the oracle is not currently a consideration since there are a number of security concerns associated with that. Data providers already have the facilities to store data long-term, and have the history and reliability of providing that data."
And what about the other way round? If a smart contract wants to send data to an external source, rather than store it on the blockchain, can ChainLink do this?
A ChainLink node can relay information from a smart contract to an external source. However, this would introduce an array of issues, as storing data in an external system means the tamper-proof aspect of data storage on the blockchain no longer applies.
Conclusion
So what will development for smart contracts on ChainLink look like? Does it come down to writing Solidity — which is not the easiest thing in the world for most people? Currently, smart contracts create their requests on-chain, and those requests are picked up by ChainLink nodes.
In the near future, ChainLink said, they will allow for requests to be initiated from off-chain services directly to a ChainLink node. This allows for requests to be created faster than the typical block time of the Ethereum network.
It also opens the door for faster blockchains to receive data at their native speed. ChainLink nodes can already query data on other blockchains with external adapters, the only caveat is a requester would need to use a ChainLink node with connectivity to that blockchain.
All in all, this is a much welcome development for smart contracts, Ethereum, and blockchain at large. It means the next step in the evolution of this ecosystem is now possible.
Granted, not everything is rosy, and smart contract and oracle development is bound to hit some of the same issues that have plagued software development and data management for decades. Hopefully, known solutions to those issues can eventually be applied to foster the growth of this ecosystem, too.

Ethereum: He Paid $113K For Virtual Token of Formula 1 Race Car

Original article: www.investinblockchain.com

In what seems to be an unbelievable purchase, a pseudonymous blockchain bidder named “09E282” has apparently paid over $113,000 in Ether (ETH) for a non-fungible token (NFT) representing a virtual Formula 1 race car, according to data from Etherscan and as reported by CoinDesk.
The virtual car was sold in a 4-day auction that included 15 bidders with 40 competing bids and ended on May 27.
The virtual Formula 1 race car was created for an unreleased racing game called F1 Delta Time and was made by Animoca Brands, a global developer and publisher of games and other apps. As reported by CoinDesk, F1 Delta Time is a blockchain-based game that bears the official seal of approval from Formula 1.
Was This Purchase a Marketing Ploy?
Spending $100,000 on a virtual item for a game that hasn’t even been released yet is strange, to say the least. It’s hard to believe that anyone, even a wealthy F1 fan, would buy this crypto collectible before the game is even out.
As reported by CoinDesk, someone from GTPlanet, a popular online community dedicated to the Gran Turismo racing game series, said:
“Why would anyone spend that much money on a virtual car in an unreleased racing game that few people know anything about? While NFT-based cryptocurrency games like F1 Delta Time are interesting and exciting new ways to use blockchain, this investment is so outrageously bizarre that it seems almost suspicious.”
Moreover, crypto influencer and self-proclaimed thought leader Richard Heart doesn’t believe this was a real purchase.
Joking via Twitter, he said he recently bought his own left sock for $100,000:
I just bought my left sock from myself for $100,000.
Heart was referring to the likelihood that whoever bought this virtual F1 racer was already the owner of it in the first place. If true, this may have been a big marketing ploy for the upcoming release of the F1 Delta Time racing game.
Do you think somebody actually bought this virtual car or did they just buy it from themselves? Let us know what you think in the comment section below.

Tron’s [TRX] DApp prowess increases in May as Ethereum and EOS are left in the dust

Original article: ambcrypto.com
The cryptocurrency industry has seen multiple sectors develop over the past couple of months with Decentralized Applications [DApps] being a hot commodity among specific organizations.
The Justin Sun-led Tron Foundation has been one of the major organizations spearheading the DApp revolution, with the currently 12th-ranked cryptocurrency possessing more DApps than its closest competitors, EOS and Ethereum [ETH]. This lead in the DApp department was also reflected in the Foundation’s latest tweet, which said:
“According to @dapp_review, as of May 27, new #Dapps on #TRON #Blockchain accounts for 31.4%, which is the fastest growing one compared with new #Dapps of other chains in last week. #TRX $TRX”
The DApp review showed that, counting new DApps per chain, Tron launched 16 DApps while Ethereum launched 15 and EOS only 5. Many DApp analysts have pointed out that games are the most popular category on the DApp list, which was again evidenced when 20 gaming DApps were launched, compared to 6 gambling DApps.
The field of DApps has seen a phenomenal boom in usage, as another report showed that Tron DApps saw more than 100,000 active users. This happened at the same time that the coin overtook Ethereum and EOS in transaction count.
Misha Lederman, a popular Tron proponent, had said:
“100k users is a major achievement for TRON since TVM launch Oct 2018, but will be a blimp on the chart in the coming months as more sophisticated DApps attract #TRX users in the millions.”
Tron was also in the news recently when it surpassed 3 million accounts in less than a year. The number of accounts/addresses clocked in at 3,004,564, with the month of May contributing account increases of anywhere between 9,000 and 17,000.

iExec to Work with France’s Largest Utility Company to Streamline Infrastructure with Ethereum App

Original article: www.newsbtc.com

Decentralised cloud computing project iExec has announced a partnership with one of the world’s largest utility companies. The French energy giant EDF hopes to overhaul its cloud-based infrastructure by building an application on the Ethereum blockchain.
GPUSPH will reportedly take advantage of “the decentralized cloud”. The app gets around Ethereum’s scalability issues by ensuring that any heavy computing is done off-chain.
EDF to Build on Ethereum to Optimise Infrastructure
The fifth largest utility company on earth has announced a partnership with the decentralised cloud computing firm iExec. The two have built out an application known as GPUSPH and deployed it on the Ethereum blockchain.
According to a press release by iExec, GPUSPH will be used to model fluids used by the energy supplier. The research aims to further optimise how dams are constructed as well as lava cooling techniques.
EDF is France’s largest and the planet’s fifth largest utility company.
The release goes on to detail how iExec will be used to address the shortcomings of Ethereum with regard to scalability. It states that the “heavy computing… is done off-chain and does not overwhelm Ethereum.”
The Ethereum blockchain is then used to find a consensus on the validity of results and a hash is stored to the blockchain.
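The off-chain-compute, on-chain-hash pattern just described can be sketched in Python. This is a toy simulation under stated assumptions: the function names and the sum-of-squares task are invented stand-ins, not GPUSPH or iExec's actual protocol, but the flow is the same — heavy work happens off-chain, and only a digest of the result is compared and committed.

```python
import hashlib

def run_off_chain(task_input):
    """Stand-in for the heavy computation a worker performs off-chain."""
    return sum(x * x for x in task_input)

def result_hash(result):
    """Only this digest would be committed on-chain, not the result itself."""
    return hashlib.sha256(str(result).encode()).hexdigest()

# Several independent workers run the same task; consensus compares digests.
task = [1, 2, 3, 4]
digests = {result_hash(run_off_chain(task)) for _ in range(3)}
assert len(digests) == 1  # all workers produced the same digest: result accepted
print(next(iter(digests))[:16])
```

Because the chain stores a short, fixed-size hash rather than the full result, verification stays cheap no matter how large the off-chain computation was.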
EDF reportedly stands to benefit from increased network resilience, performance, and transparency by choosing to work using the blockchain solution provided by iExec.
Gilles Deleuze, a blockchain engineer at EDF, stated the following about the new partnership:
“In a wider perspective, the development of distributed computing is a credible scenario for the future, and blockchain may be a nice lever in this scenario. The plan is to continue with other open scientific codes requiring possibly other types of worker pools.”
The release goes on to state that this is the first of many proposed experiments that the two companies will work on.
Has the iExec Token Price Responded Favourably to the Partnership?
Despite having no discernible use other than to pay for the services of the company itself, iExec has its own token. It was launched via initial coin offering (ICO) in April 2017.
Although spending much of 2019 in a gradual ascent, the last couple of weeks have seen the price bleed from a yearly high of over $0.88 down to just under $0.48 at the time of writing.
With such a large partnership being announced, you would have expected iExec RLC (RLC) to be one of the best performers of the day on an overall green day across the market. However, this has not been the case. Apparently, it will take more than a partnership with one of the planet’s largest energy suppliers for iExec bag-holders to finally get to offload their holdings.

Ethereum Classic May Delay Upcoming Hard Fork ‘Atlantis’

Original article: www.coindesk.com
Ethereum classic’s open-source developer team failed to reach a consensus Thursday on whether to move forward with a forthcoming system-wide code upgrade as outlined, effectively sending the planned batch of upgrades back to a drafting stage.
Developers have been deliberating on a set of 10 proposals to be integrated into the protocol since February, an upgrade colloquially called “Atlantis.” A continuation of the original ethereum blockchain, ethereum classic (ETC) effectively broke away from the project in 2016, subsequently rising to a near $1 billion valuation, according to CoinMarketCap.
Still, while ethereum classic has strived to carve out a unique value proposition (based on an altered monetary policy among other differences), its community has also been making greater efforts to introduce changes to the network that would make interoperability between the two blockchains easier.
In fact, Atlantis is the first of two protocol upgrades or hard forks aimed at incorporating the EIPs that have already been activated on ethereum in recent years.
“These upgrades would bring ETC up to date with ETH’s latest protocol, making migration of dapps between the networks much easier,” wrote Bob Summerwill, the executive director of Ethereum Classic Cooperative in an email newsletter back in May.
Today, it was expected that the community would come to a final decision about the contents of the upgrade and its planned activation for mid-September. However, some developers expressed hesitation about including one particular proposal – EIP 170 – into the Atlantis upgrade.
Summarizing his thoughts on the proposal in a GitHub comment, ethereum classic developer Anthony Lusardi wrote:
“These rules can simply be applied to transaction validation rather than block validation, making it a soft fork rather than a hard fork… It’s vitally important to stick to pre-agreed rules when they’re defined.”
EIP 170
As background, EIP 170, if implemented, would put a fixed cap on the size of smart contract code that can be deployed in a single transaction. This idea was originally conceived by ethereum founder Vitalik Buterin, who explained at the time that a cap was necessary to prevent certain attack scenarios on the blockchain.
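The check EIP 170 introduces is simple to illustrate. The 24,576-byte (0x6000) figure comes from EIP 170 itself; the function below is an illustrative sketch of the rule a node applies at contract creation, not actual client code.

```python
# EIP 170 caps deployed contract code at 24,576 bytes (0x6000).
MAX_CODE_SIZE = 0x6000

def deployment_allowed(runtime_bytecode: bytes) -> bool:
    """Mimic the EIP 170 check a node applies when a contract is created."""
    return len(runtime_bytecode) <= MAX_CODE_SIZE

small_contract = bytes(1_000)                  # well under the cap
oversized_contract = bytes(MAX_CODE_SIZE + 1)  # one byte over the cap
print(deployment_allowed(small_contract))      # True
print(deployment_allowed(oversized_contract))  # False
```

The debate in the ethereum classic community is not about the check itself but about where it is enforced: applied at transaction validation it could ship as a soft fork, applied at block validation it requires a hard fork.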
However, like Lusardi, ethereum classic community member “MikO” argues this change doesn’t have to be a hard fork (i.e. backwards incompatible).
“I don’t like the idea of having to change a hard limit in the future if we desire more complex contracts,” wrote MikO on the ethereum classic Discord channel.
At the same time, both Lusardi and MikO emphasize their disagreements with EIP 170 should not delay or in any way obstruct the progression of the Atlantis upgrade.
MikO highlighted:
“If everyone feels setting this limit in this way is the way to proceed, then I agree with the majority.”
Lusardi added that, outside of not wanting to delay the Atlantis upgrade, he doubly doesn’t believe that “any one person should be able to halt the [upgrade] process.”
For now, no decision has been made on the timeline or content of the Atlantis upgrade as a result of the comments shared in today’s developer call.
“Let’s just acknowledge there’s discussion around EIP 170 and take this time, another one or two weeks [to discuss] what’s the problem with a maximum code size limits and how to move on,” concluded ethereum classic developer soc1c.

State Farm, USAA to Pay Each Other Insurance Claims on Blockchain by 2020

Original article: www.coindesk.com
U.S. insurance giants State Farm and USAA have entered advanced testing of a blockchain to automate the time-consuming and paper-heavy processing of automobile claims.
Announced Thursday, the companies developed this system using Quorum, the private enterprise version of ethereum created by JPMorgan Chase.
State Farm and USAA began working on the joint platform (which is yet to be named) in early 2018. State Farm disclosed the project was in early trials in December, but kept USAA’s involvement under wraps at the time.
Now, the firms are using real claims data and expect to go into production around the end of this year.
The ledger shared by the two companies is designed to replace existing systems for subrogation. This is typically the last phase of the claims process when one insurance company recovers claim costs it paid to its customer for damages from the at-fault party’s insurance company.
In simple terms: if Bob dents Alice’s car, Alice’s insurer pays her to repair it and then bills Bob’s insurer.
Today, these business-to-business payments are handled consecutively, often requiring paper checks to be mailed on a claim-by-claim basis between insurers. Subrogation payments totaled about $9.6 billion across all U.S. insurance carriers last year, according to Mike Fields, an innovation executive at State Farm.
To streamline this process, the new blockchain nets out the balance of these payments (which run into the thousands each month) and facilitates a single payment on a regular basis between insurers.
Fields told CoinDesk:
“For now, it’s two companies on the distributed ledger but we could have many carriers participating. The blockchain keeps track of every individual instance and these can be netted at different intervals; weekly, monthly, whatever time frame you want.”
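The netting Fields describes can be sketched in a few lines of Python. The claim amounts and the sign convention below are illustrative assumptions, not data from the actual ledger: each insurer's balance sums what it is owed minus what it owes, so a whole period of claims collapses into one settlement per party.

```python
def net_subrogation(claims):
    """Net pairwise subrogation claims into per-insurer balances.

    `claims` is a list of (payer, payee, amount) tuples accumulated on
    the shared ledger over a settlement period. A positive balance means
    the insurer is owed money; a negative balance means it must pay.
    """
    balance = {}
    for payer, payee, amount in claims:
        balance[payer] = balance.get(payer, 0) - amount
        balance[payee] = balance.get(payee, 0) + amount
    return balance

# StateFarm owes USAA 1200 + 300; USAA owes StateFarm 900.
period = [
    ("StateFarm", "USAA", 1200),
    ("StateFarm", "USAA", 300),
    ("USAA", "StateFarm", 900),
]
print(net_subrogation(period))  # {'StateFarm': -600, 'USAA': 600}
```

Instead of three paper checks, the period settles with a single $600 payment from StateFarm to USAA, and the same netting generalizes to any number of carriers on the ledger.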
Blockchainers in the public space working with cryptocurrencies and tokens might think this is a relatively modest innovation, but Ramon Lopez, vice president of innovation at USAA, emphasized the challenge of simply getting two large competing firms to collaborate, adding:
“I think innovation is hard already, but the ability for two very large companies like State Farm and USAA to partner their innovation teams to bring something to light is worth mentioning as well.”
Why Quorum?
Quorum is a somewhat surprising choice of platform for State Farm and USAA.
That’s because both are members of the RiskBlock Alliance, the insurance blockchain consortium, which last year announced R3’s Corda platform would be its distributed ledger technology (DLT) of choice.
The re-insurance blockchain consortium B3i has also opted for Corda, suggesting an interoperability play across the insurance industry.
According to State Farm’s Field, Quorum’s focus on data privacy was a factor in the decision.
“When we were starting out on this journey, we looked at them [DLTs] all. We wanted a private permissioned network that we could invite others to join with strong privacy and security.”
State Farm and USAA did look at Corda, Fields said, adding that it too would have been a suitable platform.
“Corda has a lot of good attributes,” he said.
Lopez stressed that both firms had embarked on a learning curve involving four phases of testing.
“We are at phase two right now and continue to learn from this technology and from one another,” he said.
Fields said both State Farm and USAA are actively participating in the consortium use cases being built at RiskBlock, but hinted at the need to get out there and innovate, concluding:
“We just saw this as an opportunity to go faster with a narrower focus.”

Ethereum 2.0: The Roadmap to More Scalable Experience

Original article: appinventiv.com
Ethereum 1.0 is getting upgraded. Ethereum 2.0 is going to be the new face of the dApp industry because of its speed, scalability, cost-effectiveness, and other such benefits. Ethereum 2.0 will be ready within 18-24 months, rolled out in 7 different phases. Ethereum 2.0 Phase 0 is expected to go live this year.
Ever since Ethereum launched back in 2015, developers have had sky-high hopes for it. While Buterin and Co., the team behind the evolution of this blockchain-based distributed computing platform, made significant changes to its consensus model and scaling solutions, developers’ demand for an integrated experience of all these changes had not yet been fulfilled.
But this Monday, the history of the Ethereum platform changed: the company announced a major upgrade, known as Ethereum 2.0 (Serenity) – an overview of which follows.
Ethereum 2.0: What It Is
Ethereum 2.0, according to Van Loon, is a distinct blockchain from the existing Ethereum chain, and a hard fork of the current blockchain is not mandatory for it to function properly. Instead, value is transmitted to Ethereum 2.0 from the ‘Proof of Work’ chain through a one-way deposit smart contract.
With this attended to, let’s have a look into the reason behind the idea of launching this upgrade, or better say, have a comparison of Ethereum 1.0 and Ethereum 2.0
Ethereum 2.0 vs Ethereum 1.0: What Everyone Ought to Know
When it comes to comparing the two versions, the following challenges associated with the current Ethereum come up as the igniting force behind the introduction of Ethereum 2.0:

Scalability: Ethereum was launched with the aim of being the world computer that manages all financial transactions and hosts dApps and smart contracts without being impractically slow. However, Ethereum 1.0 is not able to fulfill this requirement while operating with the PoW (Proof-of-Work) algorithm – something that has left room for other, more scalability-friendly platforms.

Security: Though not a major issue, the security level and considerations associated with Ethereum 1.0 are not advanced. They have to be improved, which is what Ethereum 2.0 is focusing on.

A solution for the difficulty bomb: Developers have been continually pushed to shift from PoW to PoS by the slowing down of mining rewards. However, this increases mining difficulty and, in the absence of any solution, results in a dead end. Ethereum 2.0 will offer dApp developers a way out, letting them build better applications.
Now that we are familiar with the reasoning behind launching Ethereum 2.0, let’s dig deeper into what this upgrade includes and when it will go live.
Ethereum 2.0 denotes a series of updates that will make Ethereum better and faster by focusing on two prime goals:
Introducing the PoS (Proof of Stake) consensus mechanism, which will eventually eradicate the need to invest in PoW (Proof of Work) mining.

Introducing sharding, which will boost the speed and throughput of ETH transactions.
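The sharding goal above can be illustrated with a toy sketch: accounts are deterministically mapped to shards so that every node agrees on where a transaction belongs, and shards can then process transactions in parallel. The shard count and the hash-based assignment below are illustrative assumptions, not the actual Ethereum 2.0 design (which coordinates shards and validators via the Beacon Chain):

```python
import hashlib

NUM_SHARDS = 64  # illustrative shard count, not the final spec value

def shard_for_account(address):
    """Deterministically map an account address to a shard.

    A stable hash (not Python's process-randomized hash()) keeps the
    mapping consistent across every node that runs this function.
    """
    digest = hashlib.sha256(address.lower().encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SHARDS

a = shard_for_account("0xAb5801a7D398351b8bE11C439e05C5b3259aec9B")
b = shard_for_account("0xab5801a7d398351b8be11c439e05c5b3259aec9b")
assert a == b          # case-insensitive: same account, same shard
assert 0 <= a < NUM_SHARDS
print(a)
```

Because each shard only handles its own slice of accounts, total throughput scales roughly with the number of shards, which is the scalability win the upgrade is after.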
Now, when it comes to this series of updates, Ethereum 2.0 is going live in different phases. As a result, 7 phases of the evolution of Ethereum 2.0 are expected to hit the market, starting with Phase 0.
Wondering what these different phases are? What will be included in each phase and when are they supposed to be made available to developers? Let’s cover this in the next section of the blog.
Different Phases of Ethereum 2.0
Phase 0: PoS Beacon Chain
The Beacon Chain is a PoS-enabled chain that runs in parallel to Ethereum's Proof of Work chain, letting blockchain dApp developers reap the benefits of the network without investing time and energy in re-learning the parameters of the platform. It is expected to enter the market this year.
Phase 1: Basic Sharding
In this phase, shard chains will work in sync with the Beacon Chain. They will give developers higher transaction speed and faster confirmation, which will in turn upgrade scalability.
Shard chains will be responsible for managing transactions and the exchange of account data, and are expected to go live in 2020.
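The partitioning idea behind shard chains can be sketched simply: each account is deterministically assigned to one of N shards, for example by hashing its address, so transactions touching different shards can be processed in parallel. The real Ethereum 2.0 assignment scheme is more involved; this toy version only illustrates the principle.

```python
import hashlib

# Toy sketch of how sharding partitions account data: hash the account
# address and take the result modulo the shard count, so every node
# agrees on which shard chain holds a given account.
# Real Ethereum 2.0 shard assignment is more involved than this.

NUM_SHARDS = 64  # Ethereum 2.0 initially planned 64 shard chains

def shard_for(address: str, num_shards: int = NUM_SHARDS) -> int:
    digest = hashlib.sha256(address.lower().encode()).digest()
    return int.from_bytes(digest[:8], "big") % num_shards

# Transactions in different shards can be validated in parallel,
# which is where the throughput gain comes from.
addr = "0x1234567890abcdef1234567890abcdef12345678"
print(shard_for(addr))  # a stable shard index in [0, 64)
```

Because the mapping is deterministic, any participant can locate an account's shard without consulting a central registry; cross-shard transactions (covered in a later phase) are what make this split complicated.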
Phase 2: EVM State Transition Functioning
Proposed to enter the market in 2020-2021, this phase of Ethereum 2.0 Serenity relates to the advent of a new EVM (Ethereum Virtual Machine) upgraded via eWASM (Ethereum WebAssembly). The new virtual machine is expected to execute code more swiftly and efficiently while supporting many more programming languages.
This phase will also introduce better protocol standardization to enhance network security.
Phase 3: Light Client State Protocol
The fourth phase of the evolution of Ethereum 2.0 will begin in 2022 and will cover improvements to the network in terms of security, scalability, and decentralization.
Phase 4: Cross-shard Transactions
This phase will deal with transactions that span shards across the complete architecture and is expected somewhere around 2022.
Phase 5: Tight Coupling with Main Chain Security
The sixth phase of Ethereum 2.0 Serenity will be associated with internally fork-free sharding and data availability proofs.
Phase 6: Super-Quadratic or Exponential Sharding
The last phase of Ethereum 2.0 (Serenity), which will go live by the end of the year 2022, will be related to managing recursive shards.
The evolution of Ethereum 2.0, across its seven phases, is expected to complete within 18-24 months. This implies we will be able to enjoy a network up to 100 times more scalable by 2022, along with other facilities like the transition of Ether tokens from the old chain to the new one.

4 Key Notes as the KIN Token Migration to Bancor Finalizes

Click here to view original web page at themerkle.com
When the Kik team announced their initial coin offering a while ago, many people had high expectations. A well-known and respected messaging service issuing its own tokens could introduce a lot more people to the cryptocurrency industry. In the next few weeks, all users must migrate their KIN tokens from Ethereum to Bancor. This move has some very interesting potential consequences.
Moving Away From Ethereum
The biggest development to take note of is how the Kik team has made it rather clear they do not want to use Ethereum’s infrastructure for their token. The decision to switch to Bancor is rather interesting, albeit not all that surprising. Numerous other projects have moved away from Ethereum in search of greener pastures. Whether or not those decisions will pan out as expected is a very different matter altogether.
With the migration to Bancor now almost completed, one can safely say the KIN token will no longer have anything to do with Ethereum come June 15. As of right now, there is still an ERC20 relay active to swap KIN to the Bancor-based token accordingly. Once fully completed, the wait begins to determine if Bancor can live up to the Kik team’s expectations in terms of sustainability and scalability.
Manually Migrating ERC20 Tokens is Pertinent
Contrary to what most users might expect, the switch from ERC20 to Bancor tokens will not occur automatically. Users are advised to either use a swap service such as CoinSwitch or Changelly, or perform this course of action through an exchange. The swap services should complete this process in 30 minutes or less, which might be the more approachable option for KIN holders.
Several exchanges have also supported this migration since March of 2019. That list includes HitBTC, CoinTiger, LAToken, and a few others. However, it seems most of the “windows” for exchanges have closed already, as this swap was announced several months ago. Using the swapping service or the ERC20 relay is still a viable option at this time. Ensuring tokens are converted sooner rather than later is the best course of action.
Finding the Right Wallet
Sorting any cryptocurrency, token, or asset is always a matter of conducting proper research. For Kin holders, moving the funds to a Bancor-based wallet can be done when using either the Ledger or Atomic Wallet, as well as the Freewallet solution. All of these platforms support the old and new token at this time, which should make it relatively easy to generate a new address to receive the correct tokens.
Another option is to use Bancor’s own Smart Wallet, which allows users to support all ERC20 and EOS tokens in existence today. By default, this also means the new Bancor-based tokens will be supported, as this integration was completed in late 2017. There are plenty of options for users to look into in this regard, albeit putting in some effort is to be expected at this time.

Boosting KIN’s Popularity on Bancor
As this token swap enters its final stages, it is not unlikely KIN will overtake some other tokens issued on Bancor in popularity, albeit briefly. It is rather interesting to take note of how many tokens are currently running on top of Bancor’s infrastructure. This list is a lot longer than most people might assume, although it is evident Ethereum remains the undisputed leader in this regard.
Until Ethereum can successfully address the scaling concerns affecting the network, it seems likely there may be a few more migration efforts in the months and years to come. While it is a popular platform to issue ICO tokens, it seems things will get rather interesting in the coming months and years. For KIN users, not too much will change in terms of using the coin. In terms of which features and use cases may be unlocked in the future, one never knows what may come next.
Disclaimer: This is not trading or investment advice. The above article is for entertainment and education purposes only. Please do your own research before purchasing or investing into any cryptocurrency or digital currency.
Image(s): Shutterstock.com

Mike Novogratz to Elon Musk: Tokenize SpaceX!

Click here to view original web page at www.livebitcoinnews.com
Hedge fund manager Mike Novogratz thinks that Elon Musk, the CEO of SpaceX, should release the company’s stock on a cryptocurrency network. In other words, he thinks the company’s shares should come only in crypto form. He also says he’d be the “first” to invest in the company if Musk ever made that decision.
Elon Musk and Crypto Don’t Always Go Together
Musk has had a very odd relationship with crypto, to say the least. For one thing, he “temporarily served” as the CEO of Dogecoin, a cryptocurrency that started as a joke and eventually rose to enjoy a market cap of more than $1 billion. While this didn’t last long, it sure had traders and enthusiasts thinking the sky was the limit.
In addition, Musk was also the target of several Ethereum bot scams on Twitter, in which hackers ultimately created fake profiles using the CEO’s likeness to get people to voluntarily donate their ether funds to hacker-controlled accounts.
Musk’s SpaceX recently enjoyed the launch of 60 new satellites. The items are heavy – roughly 500 pounds each. This is so they can orbit Earth at a much lower altitude. This allows them to provide much cheaper internet service to the planet’s many residents, which was the company’s primary goal with the launch. In all, the company plans to launch more than 11,000 total satellites in the coming months. It’s an ambitious feat, to say the least, but one that has many potential benefits.
The company also garnered roughly $1 billion in additional monies through a new funding round. Whatever isn’t used towards the satellite project will go to the company’s next big venture known as Starship, which involves building a giant rocket designed to ship both humans and assorted cargo to the planet Mars.
Novogratz has been a proponent of tokenization for many years. The process involves an enterprise issuing stock shares on a blockchain network, meaning all shares would be digitized. He’s certain the process could ultimately “upend” Wall Street and put more financial control into the hands of average people rather than money-minded bankers.
Would This Really Work?
It’s unclear, however, if this process would work as well as he claims it would. For one thing, everyday persons do not have the education that Wall Street players have in managing funds, nor do they have the experience of company chief executives. It’s hard to say how the situation would go; companies would either thrive to the hilt, or all come crashing down in melodramatic fashion.
Either way, Novogratz believes Musk should start thinking about the tokenization of SpaceX. He’s confident it could give rise to an entirely new wave of investors, and with all the bullish activity surrounding crypto as of late, he could very well be right.

Academic credential verification startup TrustED taps Binance’s blockchain platform

Click here to view original web page at www.tokenpost.com
Mon, 27 May 2019, 11:40 am UTC
Image: Shutterstock
TrustED, an Adelaide-based academic credential verification startup, has entered into an agreement with leading cryptocurrency exchange Binance to utilize Binance Chain.
Binance Chain, Binance’s public blockchain platform, was launched in April following the public testnet phase which commenced in February 2019.
Founded in 2017, TrustED aims to offer technology and training to educators to help them store, issue, and verify academic credentials such as diplomas and certificates using blockchain technology. The objective is to digitize credentials and thwart the creation and issuance of fraudulent and falsified documents.
While the startup initially intended to utilize the Ethereum blockchain for its application use case, it has announced that it will use Binance Chain to realize its goal. With this, it has become one of the first startups to use Binance’s blockchain platform.
TrustED noted that with one-second block times and near-instant confirmation of transactions, Binance Chain is “poised to be a revolutionary stepping stone in the bid to bring cryptocurrencies and blockchain technology to the masses.”
"Being one of the first projects on Binance Chain is not only an honor but also a massive stepping stone for the TrustED project. With Binance technology behind us, TrustED can deliver on SLAs and security requirements necessary to make a blockchain-based academic solution enterprise-grade,” Kosta Batzavalis, TrustED CEO, said.
According to the official release, TrustED will be among the very first tokens to be launched on Binance Chain. TrustED also plans to conduct a public token offering for its fundraising and community building efforts, which would be the first Initial Token Offering to take place with the native Binance Chain BEP2 token standard.
"Binance Chain and the introduction of the Binance DEX enables thousands of crypto tokens and companies to utilize the technology in an efficient and effective manner,” Ted Lin, Chief Growth Officer at Binance stated. “We’re excited to have TrustED be one of the first startups to utilize Binance Chain and look forward to the growth that is to come in further bringing cryptocurrency mainstream."
Last week, Verasity, a digital currency for online video players, also partnered with Binance Chain as part of its efforts to bring about a new incentivised video economy.