
ChainLink launches Mainnet to get data in and out of Ethereum smart contracts

Original article at www.zdnet.com
Blockchain may be one of the most promising technologies today, but that may just as well be the reason why there’s also a lot of FUD around it. Speculation and crypto-winter aside, however, there’s a number of technology issues to address before blockchains can get real, and data access is prominent among them.
In a nutshell, blockchains are not very efficient as a data storage and retrieval mechanism. This is why people have been experimenting with various approaches to use blockchains as a database, including altering its structure.
Regardless of how successful these turn out to be, however, one thing is certain: Most of the world’s data today does not live on a blockchain. The vast majority of application data live in some database, and some of that data may be accessed via APIs.
How, and why, would the world of databases and APIs talk to the world of blockchain? Enter ChainLink.
Smart contracts and the connectivity problem
You may have heard about smart contracts. You can think of smart contracts as programs on the Ethereum blockchain that execute exactly as their creators set them up to. Smart contracts enhance Ethereum with the ability to execute tamper-proof code, in addition to storing tamper-proof data, turning it into a "world computer."
Together, smart contracts and data form the building blocks for decentralized applications (Dapps) and even whole decentralized autonomous organizations (DAOs). There is a programming language (Solidity) used to develop smart contracts, as well as a development framework (Truffle) that can be used to build smart contract applications.
Despite the fact that this is still not a 100% mature stack, people are using it to develop Dapps and DAOs. Smart contracts can interact with each other, and they can also store and retrieve data on the blockchain. But what happens when they need to interact with the outside world, and retrieve (or store) data from/to databases or APIs?

The Smart Contract Connectivity Problem, as ChainLink defined it, is the inability of a smart contract to interact with any external data feed or other resource that is run outside the node network in which the smart contract itself is executed.
This lack of external connectivity is inherent to all smart contract networks, due to the method by which consensus is reached around blockchain transactions, and will therefore be an ongoing problem for all smart contract networks.
ChainLink, co-founded by CEO Sergey Nazarov and CTO Steve Ellis, aims to solve this problem by developing a so-called oracle, officially launching today. ZDNet connected with the ChainLink team to discuss what this is all about.
ChainLink, the blockchain oracle
An oracle is a gateway between a blockchain and the real world. Oracles retrieve data from outside the blockchain and pass it on to smart contracts. The problem, of course, is that oracles introduce the need for centralization and trust in the decentralized, trustless world of blockchains.
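The round trip described here can be sketched in a few lines. This is a deliberately naive, single-oracle model, and all names are illustrative rather than ChainLink's actual API; it makes the trust problem visible, since whoever runs the oracle is a single arbiter of the truth.

```python
# Toy model of a single oracle bridging an off-chain feed and a contract.
# All names are illustrative; this is not ChainLink's actual API.

def off_chain_price_feed():
    """Stands in for an HTTP call to an external data source."""
    return 136.99

class SmartContractStub:
    """Minimal stand-in for an on-chain consumer of oracle data."""
    def __init__(self):
        self.last_price = None

    def fulfill(self, value):
        # The callback the oracle invokes to deliver data on-chain.
        self.last_price = value

def oracle_round(contract, feed):
    # The single trusted hop: the oracle alone decides what the
    # contract sees, which is exactly the centralization problem.
    contract.fulfill(feed())

contract = SmartContractStub()
oracle_round(contract, off_chain_price_feed)
print(contract.last_price)  # 136.99
```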
ChainLink’s whitepaper, published in 2017, tries to address this on the technical level. Part of ChainLink’s implementation runs on-chain and part off-chain. There are provisions for Service Level Agreements (SLAs), mechanisms for data source selection, result aggregation, and reporting.
There is an API data providers can use to feed their data in ChainLink’s oracle. There are also decentralization approaches and security services outlined, to ensure that ChainLink is robust and secure. One of the things we inquired about was how close today’s launch is to the vision outlined in the ChainLink whitepaper [PDF].
The smart contract connectivity problem: how do smart contracts interoperate with data and APIs beyond the blockchain? Image: ChainLink
The ChainLink team noted that the initial launch is focused on allowing smart contracts to retrieve external data from ChainLink nodes based on the number of individual requests they create. While this is an essential first step, it does not fully implement all of the features discussed in the white paper. ChainLink believes that’s a process that can and should be gradually upgraded as development progresses.
In order to assist smart contract creators today, they went on to add, ChainLink provides documentation and contract examples on how to create requests to multiple oracles and aggregate responses. The Service Agreement Protocol, currently in development, will allow a requester to define parameters for their requests in a setup step, such that a single request can receive responses from multiple oracles.
In other words, there is a certain degree of technical forethought that has been built into this, although it’s not fully implemented yet. Part of it is there to ensure the oracle is resilient (i.e. it does not crash under heavy load), and part of it to ensure it’s decentralized (i.e. there’s no single point of failure/arbiter of the truth).
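Aggregating responses from multiple oracles, as the documentation and contract examples describe, can be sketched as follows. Taking the median is one common robust choice; the function name and parameters are illustrative, not part of ChainLink's API.

```python
import statistics

def aggregate_responses(responses, min_responses=3):
    """Combine numeric answers from several oracle nodes.

    The median tolerates a minority of faulty or malicious nodes,
    which is one way to avoid a single arbiter of the truth.
    """
    if len(responses) < min_responses:
        raise ValueError("not enough oracle responses to aggregate")
    return statistics.median(responses)

# Three hypothetical nodes report an ETH/USD price; one is an outlier.
print(aggregate_responses([136.99, 137.02, 250.00]))  # 137.02
```

Note how the outlier node has no effect on the result, which is what makes median aggregation attractive over a simple average.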
Building an ecosystem
ChainLink is launching with three endorsed oracles, including its own. The other teams are Fiews and LinkPool. These teams have been running a ChainLink node on the Ethereum test networks for around a year, and have assisted with the development of the ChainLink node. ChainLink noted they will also have an on-boarding process for endorsed ChainLink nodes to be listed in official documentation.
Other third parties are able to run ChainLink nodes themselves, as ChainLink code is open source. Third parties may use other listing services (currently in development) in order to receive requests from smart contracts.
Any service provider can use ChainLink oracles for their smart contracts. If someone wants to use their own data for their smart contracts, they are free to connect to their own data source. Furthermore, the ChainLink team added, this depends on your perspective:
ChainLink is not just providing the infrastructure to help the development of smart contracts, but also building an ecosystem around this itself. Image: ChainLink
"As a data provider, how do I sell my data to smart contracts? The answer is to create an external adapter for my API, run a ChainLink node, and allow smart contracts to create requests to my oracle. As a general node operator, how do I sell data for X API? They would either need to create an external adapter themselves, which may not be viable if they’re not a developer (which is not a requirement), or they can find an open source implementation of an external adapter for the API they’re wanting to provide. We’ve built the ChainLink node to be modular by-design, so external adapters can easily be added by node operators to extend the functionality of their node without needing to know how to write programs."
The ChainLink ecosystem, today and tomorrow
Part of the value ChainLink brings is by providing the infrastructure for anyone to run an oracle, and part of it comes from its own oracle and ecosystem. There have been various names flying around, including a proof of concept project with SWIFT, and alleged "white label" partners such as Salesforce and Microsoft Azure.
The SWIFT proof of concept pulled interest rates from five banks (Barclays, BNP Paribas, Fidelity, Societe Generale, and Santander) and fed the data into a smart contract, which was used to make a payment that translated into a SWIFT payment message.
ChainLink clarified that there are three types of projects in the ecosystem: data providers, platforms/blockchains, and projects that use ChainLink oracles. Although ChainLink refrained from pointing to a comprehensive list, they pointed to a Decrypt article which mentions many collaborators and projects. There is a lot of speculation in the industry, they added, and they only confirm partnerships once they are official.
ChainLink provides more than the technical infrastructure here — they also provide an instance of this infrastructure, with vetted data providers onboarded. ChainLink emphasized that they work with top data partners for officially created adapters such as crypto price data, supply chain, etc.
Essentially, there are two layers of selection there: One on the oracle network, and one within each oracle. Users can choose which oracle(s) to use in the oracle network, and oracle nodes can choose which external services to connect to.
This also poses some interesting technical challenges. Essentially, oracles will act as data hubs, with data flowing in and out of them. How will the different data providers and data streams be cataloged, integrated, and managed? And what about issues related to data freshness, correctness, and performance?
Data selection and schema matching
ChainLink currently operates with a schema system based on JSON Schema, to specify what inputs each adapter needs and how they should be formatted. Similarly, adapters specify an output schema to describe the format of each subtask’s output.
Schema management at scale, with data coming from various domains and sources, is a well-researched and documented topic, but that does not make it easy to deal with in practice. This is especially true when using JSON Schema, which is not the most advanced solution for schema management.
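To illustrate what such schemas buy you, here is a minimal hand-rolled validator covering a small subset of JSON Schema: `type`, `required`, and nested `properties`. The price-feed schema is hypothetical; note that nothing in it says what the data means, only what shape it has.

```python
def validate(value, schema):
    """Check `value` against a tiny subset of JSON Schema:
    `type`, `required`, and nested `properties`."""
    types = {"object": dict, "string": str, "number": (int, float)}
    if not isinstance(value, types[schema["type"]]):
        return False
    if schema["type"] == "object":
        # Every required key must be present...
        for key in schema.get("required", []):
            if key not in value:
                return False
        # ...and every known property must match its sub-schema.
        for key, sub in schema.get("properties", {}).items():
            if key in value and not validate(value[key], sub):
                return False
    return True

# Hypothetical input schema for a price-feed adapter.
price_request_schema = {
    "type": "object",
    "required": ["base", "quote"],
    "properties": {"base": {"type": "string"}, "quote": {"type": "string"}},
}

print(validate({"base": "ETH", "quote": "USD"}, price_request_schema))  # True
print(validate({"base": "ETH"}, price_request_schema))                  # False
```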
ChainLink has been used in a proof of concept to clear payments with SWIFT and global banks. Image: ChainLink
So what happens when there is no sufficient metadata on the data flowing through ChainLink? Not to mention that even sufficient metadata can be erroneous or misleading. What happens if I connect a data provider and claim it's about topic A, but others say it is really about topic B, or C, or D and E? ChainLink says this is where decentralization plays a key role in the oracle problem:
"Just like how smart contracts are secure because they’re ran on multiple machines (blockchain nodes), you can secure the inputs to your smart contracts by having that input retrieved by multiple ChainLink nodes. So if you’re a requester, and you want data from a particular API DPA, you define how many ChainLink nodes you want to retrieve that data. To further decentralize your inputs, and if there are additional data providers with the same topic of data, you could have additional ChainLink nodes retrieve from another API DPB to assist with validation."
However, we would argue that while that does indeed address the topic of data source selection, it does not address that of schema matching: The terms used to describe what DPA and DPB are about could be different, and yet their data could be about the same thing. Based on JSON Schema, without a mechanism to align the metadata in place, nobody would ever know.
Data flow
From a data architecture perspective, ChainLink looks like a data hub through which data will flow transiently. However, in a published list of use cases, interacting with databases and data in the cloud is mentioned.
We wondered whether there are implementations of such use cases to show for today. Plus, if this takes off, the amount of data flowing through ChainLink will be considerable. Would ChainLink consider storing any of that data in the oracle, for example for caching?
ChainLink’s view is that they like to think of it as an on-chain protocol that allows smart contracts and node operators to work with one another in a trust-minimized way:
The ChainLink oracle acts as a data hub, enabling data source selection on 2 levels: oracle and external data sources. Image: ChainLink
"This means that any endpoint that a node operator can access can be used by a smart contract through our protocol. We have a number of working implementations that give smart contracts the ability to retrieve data from authenticated data sources. Storing or caching data within the oracle is not currently a consideration since there are a number of security concerns associated with that. Data providers already have the facilities to store data long-term, and have the history and reliability of providing that data."
And what about the other way round? If a smart contract wants to send data to an external source, rather than store it on the blockchain, can ChainLink do this?
A ChainLink node can relay information from a smart contract to an external source. However, this would introduce an array of issues, as storing data in an external system means the tamper-proof aspect of data storage on the blockchain no longer applies.
Conclusion
So what will development for smart contracts on ChainLink look like? Does it come down to writing Solidity, which is not the easiest thing in the world for most people? Currently, smart contracts create their requests on-chain, and those requests are picked up by the ChainLink node.
In the near future, ChainLink said, they will allow for requests to be initiated from off-chain services directly to a ChainLink node. This allows for requests to be created faster than the typical block time of the Ethereum network.
It also opens the door for faster blockchains to receive data at their native speed. ChainLink nodes can already query data on other blockchains with external adapters; the only caveat is that a requester would need to use a ChainLink node with connectivity to that blockchain.
All in all, this is a much welcome development for smart contracts, Ethereum, and blockchain at large. It means the next step in the evolution of this ecosystem is now possible.
Granted, not everything is rosy, and smart contract and oracle development is bound to hit some of the same issues that have plagued software development and data management for decades. Hopefully, known solutions to those issues can eventually be applied to foster the growth of this ecosystem, too.

iExec to Work with France’s Largest Utility Company to Streamline Infrastructure with Ethereum App

Original article at www.newsbtc.com

Decentralised cloud computing project iExec has announced a partnership with one of the world’s largest utility companies. The French energy giant EDF hopes to overhaul its cloud-based infrastructure by building an application on the Ethereum blockchain.
GPUSPH will reportedly take advantage of “the decentralized cloud”. The app gets around Ethereum’s scalability issues by ensuring that any heavy computing is done off-chain.
EDF to Build on Ethereum to Optimise Infrastructure
The fifth largest utility company on earth has announced a partnership with the decentralised cloud computing firm iExec. The two have built out an application known as GPUSPH and deployed it on the Ethereum blockchain.
According to a press release by iExec, GPUSPH will be used to model fluids used by the energy supplier. The research aims to further optimise how dams are constructed as well as lava cooling techniques.
EDF is France’s largest and the planet’s fifth largest utility company.
The release goes on to detail how iExec will be used to address the shortcomings of Ethereum with regard to scalability. It states that the “heavy computing… is done off-chain and does not overwhelm Ethereum.”
The Ethereum blockchain is then used to find a consensus on the validity of results and a hash is stored to the blockchain.
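The pattern of computing off-chain and anchoring only a hash on-chain can be sketched as follows. The digest is deterministic, so any two nodes that produced the same result produce the same hash; the field names here are made up for illustration.

```python
import hashlib
import json

def result_digest(result):
    """Deterministically serialize an off-chain result and hash it.

    Only this digest would be stored on-chain; nodes that computed
    the same result produce the same digest, so agreement can be
    checked without putting the heavy data itself on the chain."""
    canonical = json.dumps(result, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Two nodes report the same result with keys in different order:
run_a = {"simulation": "dam-fluid-model", "peak_pressure": 42.7}
run_b = {"peak_pressure": 42.7, "simulation": "dam-fluid-model"}
assert result_digest(run_a) == result_digest(run_b)  # same result, same hash
```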
EDF reportedly stands to benefit from increased network resilience, performance, and transparency by choosing to work using the blockchain solution provided by iExec.
Gilles Deleuze, the blockchain engineer at EDF stated the following of the new partnership:
“In a wider perspective, the development of distributed computing is a credible scenario for the future, and blockchain may be a nice lever in this scenario. The plan is to continue with other open scientific codes requiring possibly other types of worker pools.”
The release goes on to state that this is the first of many proposed experiments that the two companies will work on.
Has the iExec Token Price Responded Favourably to the Partnership?
Despite having no discernible use other than to pay for the services of the company itself, iExec has its own token. It was launched via initial coin offering (ICO) in April 2017.
Although it spent much of 2019 in a gradual ascent, the last couple of weeks have seen the price bleed from a yearly high of over 88c down to just under 48c at the time of writing.
With such a large partnership being announced, you would have expected iExec RLC (RLC) to be one of the best performers on an otherwise green day across the market. However, this has not been the case. Apparently, it will take more than a partnership with one of the planet’s largest energy suppliers for iExec bag-holders to finally get to offload their holdings.

Ethereum Classic May Delay Upcoming Hard Fork ‘Atlantis’

Original article at www.coindesk.com
Ethereum classic’s open-source developer team failed to reach a consensus Thursday on whether to move forward with a forthcoming system-wide code upgrade as outlined, effectively sending the planned batch of upgrades back to a drafting stage.
Developers have been deliberating on a set of 10 proposals to be integrated into the protocol since February, an upgrade colloquially called “Atlantis.” A continuation of the original ethereum blockchain, ethereum classic (ETC) effectively broke away from the project in 2016, subsequently rising to a near $1 billion valuation, according to CoinMarketCap.
Still, while ethereum classic has strived to carve out a unique value proposition (based on an altered monetary policy among other differences), its community has also been making greater efforts to introduce changes to the network that would make interoperability between the two blockchains easier.
In fact, Atlantis is the first of two protocol upgrades or hard forks aimed at incorporating ethereum improvement proposals (EIPs) that have already been activated on ethereum in recent years.
“These upgrades would bring ETC up to date with ETH’s latest protocol, making migration of dapps between the networks much easier,” wrote Bob Summerwill, the executive director of Ethereum Classic Cooperative in an email newsletter back in May.
Today, it was expected that the community would come to a final decision about the contents of the upgrade and its planned activation for mid-September. However, some developers expressed hesitation about including one particular proposal – EIP 170 – into the Atlantis upgrade.
Summarizing his thoughts on the proposal in a GitHub comment, ethereum classic developer Anthony Lusardi wrote:
“These rules can simply be applied to transaction validation rather than block validation, making it a soft fork rather than a hard fork… It’s vitally important to stick to pre-agreed rules when they’re defined.”
EIP 170
As background, EIP 170 if implemented would put a fixed cap on the size of smart contract code that can be run in a single transaction. This idea was originally conceived by ethereum founder Vitalik Buterin who explained at the time that a cap was necessary to prevent certain attack scenarios on the blockchain.
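EIP 170's cap is a simple size check at contract-creation time: 0x6000 (24,576) bytes of runtime code. A sketch of the rule a compliant client applies:

```python
# EIP-170 caps deployed contract runtime code at 0x6000 (24,576) bytes.
MAX_CODE_SIZE = 0x6000

def deployable(runtime_code: bytes) -> bool:
    """Return True if the contract's runtime bytecode fits under the
    EIP-170 limit; a compliant client rejects the deployment otherwise."""
    return len(runtime_code) <= MAX_CODE_SIZE

print(deployable(b"\x60" * 24576))  # True: exactly at the cap
print(deployable(b"\x60" * 24577))  # False: one byte over
```

The cap exists to bound the cost of reading contract code from state, which is the attack scenario the proposal addresses.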
However, like Lusardi, ethereum classic community member “MikO” argues this change does not have to be a hard fork (i.e., backwards incompatible).
“I don’t like the idea of having to change a hard limit in the future if we desire more complex contracts,” wrote MikO on the ethereum classic Discord channel.
At the same time, both Lusardi and MikO emphasize their disagreements with EIP 170 should not delay or in any way obstruct the progression of the Atlantis upgrade.
MikO highlighted:
“If everyone feels setting this limit in this way is the way to proceed, then I agree with the majority.”
Lusardi added that, outside of not wanting to delay the Atlantis upgrade, he also doesn’t believe that “any one person should be able to halt the [upgrade] process.”
For now, no decision has been made on the timeline or content of the Atlantis upgrade as a result of the comments shared in today’s developer call.
“Let’s just acknowledge there’s discussion around EIP 170 and take this time, another one or two weeks [to discuss] what’s the problem with a maximum code size limits and how to move on,” concluded ethereum classic developer soc1c.

Academic credential verification startup TrustED taps Binance’s blockchain platform

Original article at www.tokenpost.com, Mon, 27 May 2019, 11:40 am UTC
TrustED, an Adelaide-based academic credential verification startup, has entered into an agreement with leading cryptocurrency exchange Binance to utilize Binance Chain.
Binance Chain, Binance’s public blockchain platform, was launched in April following the public testnet phase which commenced in February 2019.
Founded in 2017, TrustED aims to offer technology and training to educators to help them store, issue, and verify academic credentials such as diplomas and certificates using blockchain technology. The objective is to digitize credentials and thwart the creation and issuance of fraudulent and falsified documents.
While the startup initially intended to utilize the Ethereum blockchain for its application use case, it has announced that it will use Binance Chain to realize its goal. With this, it has become one of the first startups to use Binance’s blockchain platform.
TrustED noted that with one-second block times and near-instant confirmation of transactions, Binance Chain is “poised to be a revolutionary stepping stone in the bid to bring cryptocurrencies and blockchain technology to the masses.”
"Being one of the first projects on Binance Chain is not only an honor but also a massive stepping stone for the TrustED project. With Binance technology behind us, TrustED can deliver on SLAs and security requirements necessary to make a blockchain-based academic solution enterprise-grade,” Kosta Batzavalis, TrustED CEO, said.
According to the official release, TrustED will be among the very first tokens to be launched on Binance Chain. TrustED also plans to conduct a public token offering for its fundraising and community building efforts, which would be the first Initial Token Offering to take place with the native Binance Chain BEP2 Token standard.
"Binance Chain and the introduction of the Binance DEX enables thousands of crypto tokens and companies to utilize the technology in an efficient and effective manner,” Ted Lin, Chief Growth Officer at Binance stated. “We’re excited to have TrustED be one of the first startups to utilize Binance Chain and look forward to the growth that is to come in further bringing cryptocurrency mainstream."
Last week, Verasity, a digital currency for online video players, also partnered with Binance Chain as part of its efforts to bring about a new incentivised video economy.

Computer Researcher Finds Wallet Vulnerability That Gave Same Key to Multiple Users

Original article at cointelegraph.com
Online cryptocurrency paper wallet creator WalletGenerator.net previously ran on code that caused private key/public key pairs to be issued to multiple users. The vulnerability was described in an official blog post by security researcher Harry Denley of MyCrypto on May 24.
According to the post, the bad code was in effect from August 2018 and was only recently patched out, as of May 23. The live code on the website is reportedly supposed to be open source and audited on GitHub, but there were differences detected between the two. After researching the live code, Denley concluded that the keys were deterministically generated on the live version of the website, not randomly.
In one of MyCrypto’s tests between May 18–23, they attempted to use the website’s bulk generator to make 1,000 keys. The GitHub version returned 1,000 unique keys, but the live code returned 120 keys. Running the bulk generator always reportedly returned 120 unique keys instead of 1,000 even when other factors were tweaked, including browser refreshes, VPN changes, or user changes.
Randomness is needed to generate the key pairings in order for the paper wallets to be secure. As the post puts it:
“ELI5: When generating a key, you take a super-random number, turn it into the private key, and turn that into the public key / address. However, if the ‘super-random’ number is always ‘5,’ the private key that is generated will always be the same. This is why it’s so important that the super-random number is actually random…not ‘5.’”
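The quoted point is easy to demonstrate: seed a pseudo-random generator with a fixed value and every "independent" run reproduces the same keys. This is a simulation of the failure mode, not WalletGenerator's actual code.

```python
import random

def weak_keygen(count):
    """Simulate a key generator whose entropy source is deterministic:
    re-seeding with the same value reproduces the exact same 'keys'."""
    rng = random.Random(5)  # the 'always 5' seed from the quote above
    return [rng.getrandbits(256) for _ in range(count)]

# Two independent "users" generate keys and collide completely:
assert weak_keygen(3) == weak_keygen(3)

# A generator seeded from OS entropy will virtually never collide:
strong = [random.SystemRandom().getrandbits(256) for _ in range(3)]
assert len(set(strong)) == 3
```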
WalletGenerator patched the determinism problem after MyCrypto reached out during the middle of its investigation. WalletGenerator purportedly responded afterward saying that the allegations could not be verified, and even asked the correspondent if MyCrypto was a “phishing website.”
MyCrypto added that users who generated keypairs after August 17, 2018 should immediately move their funds to a different wallet and recommended not to use WalletGenerator.net.
As previously reported by Cointelegraph, a so-called “blockchain bandit” made off with around 45,000 ether (ETH) by guessing weak private keys on the Ethereum blockchain.

Ethereum Upgrades as Hard Forks Constantinople and St Petersburg Activate on Blockchain

Original article at www.coindesk.com

The long-anticipated upgrade Constantinople has officially activated on the ethereum blockchain.
At 19:57 (UTC), the sixth system-wide upgrade to be released since the second largest cryptocurrency by market cap launched in 2015 has successfully been rolled out onto the main network at block number 7,280,000.
But that’s not all. The unusual part about today’s hard fork is that there are two of them. St. Petersburg, ethereum’s seventh system-wide upgrade, has been released simultaneously and, as intended, has disabled the part of the Constantinople code deemed back in January to contain security vulnerabilities that could be used by attackers to steal funds.
Because this is such a big upgrade, it is important that it goes well so as not to cause ethereum to split. So far, as seen on blockchain monitoring website Fork Monitor, there is no evidence of a significant chain split to suggest that a portion of ethereum users are still running old ethereum software.
As background, before any system-wide upgrade, also called a hard fork, users such as miners and operators of ethereum-based applications are required to install new client software that activates automatically at the exact same block number.
This prevents two concurrent and incompatible versions of the same blockchain from splitting the wider network.
“With the blockchain, everyone has to upgrade in order for everyone to be able to use [the new] features,” explained Taylor Monahan – CEO of blockchain wallet tool MyCrypto.
That means everyone has to be prepared ahead of time.
“About two weeks before the fork, everyone upgrades the software but none of the new features are enabled,” said Monahan to CoinDesk. “Then, on that block number, everyone at the exact same time starts using the new features. So, that’s how we prevent differing states from existing simultaneously. It’s [also] called a consensus issue or a consensus bug.”
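The coordination mechanism Monahan describes boils down to a feature flag keyed on block height, identical in every upgraded client. A sketch, using Constantinople's actual activation block:

```python
# Constantinople activated at mainnet block 7,280,000.
CONSTANTINOPLE_BLOCK = 7_280_000

def features_enabled(block_number):
    """Every node runs the same rule: new behaviour switches on at an
    agreed block height, so all upgraded clients change their
    state-transition rules at exactly the same point in the chain."""
    return {"constantinople": block_number >= CONSTANTINOPLE_BLOCK}

print(features_enabled(7_279_999))  # {'constantinople': False}
print(features_enabled(7_280_000))  # {'constantinople': True}
```

A node still running old software never flips the flag, which is exactly how two incompatible chains (a consensus bug) would arise.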
MyCrypto presently runs 10 to 15 computer servers also called nodes all running on the most updated version of the Parity ethereum client.
With today’s release of Constantinople and St. Petersburg, four different ethereum improvement proposals (EIPs) have been officially activated on the ethereum network, one of which does introduce a new “corner case” affecting smart contract immutability.
As of press time, the market price for ether – the main cryptocurrency of the network – has seen a small jump from $135.14 shortly prior to mainnet release and presently sits at $136.99, according to the CoinDesk Price Index.

Ethereum Scaling Tech Monoplasma Wants to Let Dapps Broadcast Crypto

Original article at www.coindesk.com

Blockchain data platform Streamr has officially released a new open-source ethereum scaling technology called Monoplasma.
Inspired by a pre-existing scaling solution called plasma, Monoplasma is different in that it focuses specifically on “one-to-many payments” in which users would need to “repeatedly distribute value to a large and dynamic set of ethereum addresses,” explained Henri Pihkala, CEO of Streamr.
Speaking to CoinDesk about use cases for the technology, Pihkala said Monoplasma isn’t just about revenue sharing. Rather, the technology is envisioned for open-source decentralized applications (dapps) looking to incorporate “dividend distributions, staking rewards, repeated airdrops,” and more.
Demonstrating the power of Monoplasma on stage, Pihkala showed how the tool can be used to drop small amounts of fake “unicorn” tokens into 200,000 addresses on a test version of the ethereum blockchain.
Marketed as a “special-purpose off-chain scaling solution,” Shiv Malik, head of communications for Streamr, likened the technology to “broadcasting money.”
“You can receive money, but you can’t send back the other way. That would be like trying to send a message to your TV,” Malik said.
As such, no double spends – where tokens are essentially counterfeited – are able to occur on a Monoplasma payment channel. “On the side channel, you can only earn money,” emphasized Pihkala.
Unidirectional Monoplasma payments system. Image courtesy of Streamr.
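The earn-only property can be modeled as a side channel where the operator may only credit members and a member's only outbound action is a full withdrawal. This toy class (names invented for illustration, not Monoplasma's API) shows why double spends cannot arise: balances never move between members and never decrease except by exiting.

```python
class OneWayChannel:
    """Toy model of a one-to-many, earn-only side channel: the operator
    can credit members, balances never decrease off-chain, and a member
    can only exit by withdrawing the whole accumulated amount."""

    def __init__(self):
        self.balances = {}

    def credit(self, member, amount):
        # Value only flows in; there is no way to "send back the other way".
        if amount <= 0:
            raise ValueError("credits must be positive")
        self.balances[member] = self.balances.get(member, 0) + amount

    def withdraw(self, member):
        # Settles on-chain in the real system; here we just zero out.
        amount = self.balances.get(member, 0)
        self.balances[member] = 0
        return amount

channel = OneWayChannel()
for addr in ("0xaaa", "0xbbb"):
    channel.credit(addr, 10)   # a broadcast-style distribution
channel.credit("0xaaa", 5)
print(channel.withdraw("0xaaa"))  # 15
```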
Streamr intends to use the technology to crowdsell user data on a blockchain. Once data from users is sold to a bidding company, payment will directly be pushed into users’ ethereum addresses.
Revealed last May, Streamr has partnered with a number of tech conglomerates including Hewlett Packard Enterprise and Finnish telecom company Nokia.
Now, all ethereum developers are encouraged to try out Monoplasma by downloading the public code repository on GitHub.
Pihkala concluded:
“If someone else finds use in [Monoplasma] that’s awesome, that’s what makes us happy. But at the very least, we’re going to build on top of it – meaning [Monoplasma] is going to be well maintained. It’s not about to be abandoned anytime soon.”

EU Commissioners, Ministers, Royalty to Speak at Austria’s Blockchain Summit

Original article at www.trustnodes.com
The European Commissioner for the Digital Economy and Society, Mariya Gabriel, Prince Michael of Liechtenstein, an Austrian minister, and numerous others are to speak at what might be Europe’s biggest blockchain event, to be held in Vienna on April 2nd and 3rd.
“This event is going to put Vienna on the map as one of the best places for business and blockchain to get together, network, communicate and build their visions of the future,” a statement says.
They hope to attract 2,000 people, with 80 national and international speakers, 40 exhibitors and 100 investors.
The main topics of the event will be government, blockchain for business, healthcare, energy, banking, supply chain and mobility, and alongside Microsoft, IBM and Accenture, there will also be representatives from Hyperledger, Bitfury, Bitmain, Raiffeisen Bank International, Wien Energie and Merck.
The German airline, Lufthansa, will also be there, as well as representatives from Binance, with the summit supported by Austria’s government.
“The focus of the conference is on promoting the discourse between public and business sectors, science and research, and art and technology, and building bridges between these industry sectors,” a press release says.
Sebastian Kurz, the Cryptonian?
Europe’s youngest leader, Sebastian Kurz, made headlines for being the only millennial to head a state after becoming Austria’s Chancellor in December 2017.
It appears one of his first priorities was to court the blockchain space, with Harald Mahrer, the Austrian Minister of Economy, stating in an announcement on blockchain strategy:
“To become an innovation leader, we must walk down new paths without taboos and cope with technologies that will radically change many areas of our life tomorrow.”
The conservative government then announced they were to issue €1.15 billion of government bonds on ethereum’s blockchain.
Blockchenizing Europe
While American authorities have been quick to intervene in the blockchain space, EU-wide policies have been slow to come by.
There are pockets of success: Estonia, Switzerland, Lithuania, and, until the Brexit referendum, a leading London.
France has also tried to attract crypto and blockchain businesses by passing laws that facilitate Initial Coin Offerings (ICOs).
This accommodative approach by some member states has led to Europe passing America in ICO fundraising. $4.1 billion was raised by projects based in Europe, while Americans raised $2.6 billion last year.
At the European Commission level, however, there hasn’t been much development. There have been some statements of aspiration, but the common market hasn’t quite been able to offer a real alternative to the United States.
That might partly be because the US Securities and Exchange Commission (SEC) has effectively exerted jurisdiction over EU-based projects, as shown by its charging of an Austrian with securities violations.
It would thus be very interesting to hear what the European Commissioners will say at this blockchain summit, especially as it pertains to ICOs and, more generally, how they can provide a real alternative to the USA for the crypto space.
Copyrights Trustnodes.com

Zilliqa Mainnet Is The First To Use The Sharding Protocol, Processes 2,500 transactions Per Second

Click here to view original web page at smartereum.com
Zilliqa is a smart contract platform similar to Ethereum, and it has just launched its mainnet. According to the development team, the platform has successfully implemented sharding, reportedly through a hybrid consensus design that combines proof-of-work with Practical Byzantine Fault Tolerance. This increased the transaction speed to 2,500 transactions per second. Unlike the Ethereum developers, the Zilliqa developers do not intend to migrate to a proof-of-stake consensus algorithm; they are pursuing a faster route to scalability.
Sharding Proof-of-work And Practical Byzantine Fault Tolerance
Sharding is the next big thing in blockchain scalability. The sharding protocol distributes the workload throughout the network rather than letting the blockchain process every transaction on all its nodes at the same time. This method increases the number of transactions processed per second. In this case, the number of transactions processed per second is 2,500.
A blog post from Zilliqa last year stated that the network can be broken down into sufficiently large subsets of nodes and hashrate. This, in turn, keeps the network secure while the load is distributed across those subsets.
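The distribution step described above can be sketched as a simple address-to-shard mapping. This is a minimal illustration of the general idea, not Zilliqa's actual assignment logic (which shards nodes using PoW-established identities); the shard count and addresses here are made up:

```python
import hashlib

NUM_SHARDS = 4  # hypothetical shard count

def shard_for(sender_address: str, num_shards: int = NUM_SHARDS) -> int:
    """Map a sender address to a shard by hashing it and taking the
    digest modulo the shard count, so each shard handles only a
    slice of the total transaction load."""
    digest = hashlib.sha256(sender_address.encode()).digest()
    return int.from_bytes(digest, "big") % num_shards

# Distribute a batch of (fictional) transactions across shards.
txs = [f"0xaddr{i}" for i in range(1000)]
shards = {}
for tx in txs:
    shards.setdefault(shard_for(tx), []).append(tx)

# Every transaction lands in exactly one shard, so the shards can
# process their slices in parallel instead of every node seeing
# every transaction.
assert sum(len(v) for v in shards.values()) == len(txs)
```

Because the mapping is deterministic, every node agrees on which shard owns a transaction without any coordination beyond knowing the shard count.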
The name Zilliqa is derived from the word silica, a central component of circuit boards. The network uses the proof-of-work consensus mechanism to prevent Sybil attacks, while consensus itself is performed by the Practical Byzantine Fault Tolerance mechanism, making it a hybrid consensus protocol.
The Zilliqa team believes this hybrid protocol is more beneficial because the proof-of-work period on the network lasts only about one minute in every two to three hours. As a result, the energy footprint of mining should be smaller than on blockchains that run proof-of-work for every block.
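A back-of-the-envelope calculation puts that claim in perspective; assuming the midpoint of the quoted two-to-three-hour window, the mining duty cycle works out to well under one percent:

```python
# One minute of PoW (identity establishment only) out of every
# 2-3 hours; take the midpoint of the quoted window.
pow_minutes = 1
cycle_minutes = 2.5 * 60  # 150 minutes

fraction = pow_minutes / cycle_minutes
print(f"PoW active {fraction:.2%} of the time")  # prints "PoW active 0.67% of the time"
```

Compare that with a pure proof-of-work chain, where mining hardware runs at full tilt continuously.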
Zilliqa Mainnet Will Be In The Bootstrap Phase
The team announced that the mainnet will be in the bootstrap phase until March 2019. During this period, miners will not process any transactions, but they will be rewarded for their computational power as usual. The goal is to ensure that the network is fully protected from bad actors during the initial launch period, when its computational power is still relatively low.
Recently, there were some rumors about Zilliqa making plans to supply Facebook with its blockchain technology. Yesterday, during an ask-me-anything session with the Twitter cryptocurrency community, the company quashed the rumors when a user asked:
“Do you guys have/had any collaboration with Facebook?”
In reply, Xishu Dong answered:
“Apart from the fact that Evan is our advisor, no. We do not.”
What Does Zilliqa’s Implementation Of Sharding Mean For Ethereum?
Ethereum has been advertising its sharding roadmap, Serenity, for a long time now. Unfortunately, it has been unable to implement it. This has given Zilliqa the opportunity to become the first blockchain to implement sharding. While this may not necessarily be a big deal, it isn’t exactly in favor of Ethereum, a blockchain that is currently struggling to maintain its position as the world’s best decentralized computer.
Do you think Zilliqa’s implementation of its own sharding protocol will make it a competition for Ethereum?