Hive Power validates its tools in an innovative “energy self-consumption community”

Hive Power, in partnership with SUPSI, AEM, Optimatik, and Landis+Gyr, launches the pilot project “Lugaggia Innovation Community” (LIC). It aims to test innovative solutions that foster the penetration of local renewable energy sources by optimizing their profitability and minimizing their impact on the grid. The project has the support of the Swiss Federal Office of Energy, the Cantonal Fund for Renewable Energy (FER), and the Agency for the Development of the Lugano Region (ERSL), and involves eighteen households and the local kindergarten of Lugaggia, a village in the region of Lugano.

The transition to renewable energy sources is one of the key objectives of the Swiss and international energy strategy in response to climate change. To implement this transition, it is necessary to radically rethink the way electrical power is delivered. In particular, Paolo Rossi, director of the local DSO (AEM SA), notes that “Many of our customers do not just consume energy: they also produce it with rooftop photovoltaic systems. Most of the time, they manage to self-consume only a small part (20%) of the electricity produced by their photovoltaic system, while most of it (80%) is fed back into the electricity grid. Such reverse power flow in the low-voltage (LV) grid poses a challenge to the stability of the grid, which was originally designed to supply power to users and not take it away from them. It also represents a direct financial loss for the user, given that the electricity fed into the network is paid much less than that taken from the network. With the LIC project, we try to overcome these problems”.

Hive Power boosts community profitability

The innovative solution, being tested in Switzerland, leverages the possibility, recently authorized by the Federal Energy Ordinance, to create a “self-consumption community”: a contractual grouping of consumers and producers who jointly use self-produced solar energy.

To reduce energy costs for consumers and increase the profitability of photovoltaic systems within the community, the self-consumption must be maximized. To this end, a 60 kWh storage battery was installed. A demand-side management scheme that coordinates the activation of deferrable loads, such as heat pumps and electric water heaters, further increases community self-consumption. The HONEY distributed control scheme developed by Hive Power will be used to control both the community storage and the deferrable loads of the community members, to boost the profitability of the community, while ensuring grid stability.
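The effect of a community battery on self-consumption can be illustrated with a short numerical sketch. The hourly profiles and the greedy charging rule below are invented for illustration; they are not LIC data and not the HONEY control scheme:

```python
# Minimal sketch: how a community battery raises self-consumption.
# Hourly PV production and load profiles (kWh) are illustrative.
def self_consumption(pv, load, battery_kwh=0.0):
    """Fraction of PV energy used locally, with a simple greedy battery."""
    soc, used_locally = 0.0, 0.0
    for p, l in zip(pv, load):
        direct = min(p, l)                         # PV consumed on the spot
        surplus, deficit = p - direct, l - direct
        charge = min(surplus, battery_kwh - soc)   # store the surplus
        soc += charge
        discharge = min(deficit, soc)              # cover the deficit later
        soc -= discharge
        used_locally += direct + charge            # stored energy counts as self-consumed
    return used_locally / sum(pv)

pv   = [0, 0, 1, 4, 6, 6, 4, 1, 0, 0, 0, 0]        # a sunny day
load = [1, 1, 1, 1, 1, 1, 1, 1, 2, 3, 2, 1]
print(self_consumption(pv, load))                  # without storage
print(self_consumption(pv, load, battery_kwh=60))  # with the 60 kWh battery
```

With these toy profiles, the battery lifts self-consumption from roughly a quarter of the PV production to all of it, which is exactly the lever the HONEY scheme exploits, together with deferrable loads.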

Hive Power also plans to test its blockchain-based tools for the management of shared assets (using the community storage as an example) and billing automation.

More info at


Trustless coordination mechanism for smart grid energy markets, a game theoretic approach

Increasing the amount of installed renewable energy sources such as solar and wind is an essential step towards the decarbonization of the energy sector.

From a technical point of view, however, the stochastic nature of distributed energy resources (DER) causes operational challenges. Among them, imbalance between production and consumption, overvoltage, and overload of grid components are the most common.

As DER penetration increases, it is becoming clear that incentive strategies such as Net Energy Metering (NEM) are threatening utilities, since NEM doesn’t reward prosumers for synchronizing their energy production and demand.

In order to reduce congestion, distribution system operators (DSOs) currently use a simple indirect method consisting of a bi-level energy tariff, i.e. the price of buying energy from the grid is higher than the price of selling energy to the grid. This encourages individual prosumers to increase their self-consumption. However, this is inefficient in regulating the aggregated power profile of all prosumers.
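A toy calculation shows why a bi-level tariff nudges an individual prosumer toward self-consumption. The prices and hourly profiles are assumed for illustration, not a real tariff:

```python
# Sketch of a bi-level tariff: energy bought from the grid costs more
# than energy sold to it, so self-consumption pays off.
BUY, SELL = 0.20, 0.08  # CHF/kWh, assumed values

def daily_cost(pv, load):
    """Prosumer's cost under a bi-level tariff, settled hour by hour."""
    cost = 0.0
    for p, l in zip(pv, load):
        net = l - p                                   # >0: import, <0: export
        cost += BUY * net if net > 0 else SELL * net  # export earns the lower price
    return cost

pv = [0, 5, 0]                    # kWh per hour
print(daily_cost(pv, [2, 1, 2]))  # load misaligned with the sun
print(daily_cost(pv, [1, 3, 1]))  # same total load, shifted into the PV hour
```

Shifting consumption into the production hour halves the bill in this example, yet the tariff says nothing about what the *aggregate* of all prosumers does to the grid, which is the inefficiency noted above.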

Utilities and governments believe that better grid management can be achieved by making the distribution grid ‘smarter’, and they are currently investing massively to realize this vision.

As I explained in my previous post on the need for decentralized architectures for new energy markets, the common view of the scientific community is that a smarter grid requires more communication between generators and consumers, adopting near real-time markets and dynamic prices, which can steer users’ consumption towards periods in which DER energy production is higher, or increase their production during high demand. For example, a modification of NEM that allows prosumers to export energy from their batteries during the evening demand peak has recently been proposed in California.

But since flexibility will be offered at different levels and will provide a number of services, from voltage control for the DSOs to control energy for the transmission system operators (TSOs), it is important to ensure that these services will not interfere with each other. So far, a comprehensive approach that treats flexibility as a system-wide leitmotiv, taking into account the effect of demand response (DR) at all grid levels, is lacking.

In order to optimally exploit prosumers’ flexibility, new communication protocols are needed which, coupled with a sensing infrastructure (smart meters), can be used to safely steer aggregated demand from the distribution grid up to the transmission grid.

The problem of coordinating dispatchable generators is well known to system operators and has been studied extensively in the literature. When grid constraints are not taken into account, it is known as economic dispatch, and consists of minimizing the generation cost of a group of power plants. When operational constraints are considered, the problem increases in complexity, due to the power flow equations governing currents and voltages in the electric grid. Nevertheless, several approaches are known for solving this problem, a.k.a. optimal power flow (OPF), using approximations and convex formulations of the underlying physics. OPF is usually solved in a centralized way by an independent system operator (ISO). However, when the number of generators increases, as in the case of DERs, the overall problem grows in complexity but can still be solved effectively by decomposing it among the generators.

The decomposition has two other main advantages over a centralized solution, besides allowing faster computation. The first is that generators do not have to disclose all their private information for the problem to be solved correctly, preserving competition among the different generators. The second is that the computation has no single point of failure.

In this direction, we have recently proposed a multilevel hierarchical control which can be used to coordinate large groups of prosumers located at different voltage levels of the distribution grid, taking into account grid constraints. The difference between power generators and prosumers is that the latter cannot control when their power is generated, but can operate deferrable loads such as heat pumps, electric vehicles, boilers and batteries.


Fig.1 Hierarchical nature of the electric grid. The grid is divided into different voltage levels (low, medium, high), each of which is operated by different entities (DSOs, TSOs). Coordination can be achieved by sending messages with a forward/backward strategy following this structure.

The idea is that prosumers in the distribution grid can be coordinated solely by means of a price signal sent by their parent node in the hierarchical structure, an aggregator. This allows the algorithm to be implemented as a forward-backward communication protocol. In the forward pass, each aggregator receives a reference price from its parent node and sends it, together with its own reference price, down to its children nodes (prosumers or aggregators) in the lower hierarchy level. This mechanism propagates through all the nodes until the terminal nodes (or leaves) are reached. Prosumers in leaf nodes solve their optimization problems as soon as the overall price signal reaches them. In the backward pass, prosumers send their solutions to their parents, which collect them and send the aggregated solution upward.
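The forward-backward message passing can be sketched schematically. The node names, the toy price-update rule and the linear price response below are invented for illustration; in the real scheme each prosumer solves a local optimization problem rather than applying a fixed formula:

```python
# Schematic forward-backward protocol on an aggregator tree.
class Node:
    def __init__(self, name, children=None, flexibility=0.0):
        self.name, self.children = name, children or []
        self.flexibility = flexibility  # kW shifted per unit price (toy model)

    def forward_backward(self, price):
        """Forward: propagate the price down. Backward: aggregate responses up."""
        if not self.children:                    # leaf prosumer
            return -self.flexibility * price     # simple linear price response
        local_price = price + 0.1                # aggregator adds its own signal (toy rule)
        return sum(c.forward_backward(local_price) for c in self.children)

tree = Node("TSO", [Node("DSO", [Node("p1", flexibility=2.0),
                                 Node("p2", flexibility=1.0)])])
print(round(tree.forward_backward(price=1.0), 3))  # aggregated power deviation
```

Note that each parent only ever sees its children's aggregated response, which is what keeps prosumer-level information local.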

Apart from this intuitive coordination protocol, the proposed algorithm has other favorable properties. One is that prosumers only need to share information on their energy production and consumption with a single aggregator, keeping all other parameters and information private. This is possible thanks to the decomposition of the control problem. The second is that the algorithm exploits parallel computation of the prosumer-specific problems, ensuring minimal communication overhead.

However, being able to coordinate prosumers is not enough.

The main difference between the OPF and DR problems is that the latter involves the participation of self-serving agents, which cannot be trusted a priori by an independent system operator (ISO). This implies that if an agent finds it profitable (in terms of its own economic utility), it will solve a different optimization problem from the one specified by the ISO. For this reason, some aspects of DR formulations are better described through a game-theoretic framework.

Furthermore, several studies have focused on the case in which grid constraints are enforced by DSOs directly modifying voltage angles at buses. Although this is a reasonable solution concept, the current shift of generation from the high voltage network to the low voltage network suggests that in the future prosumers, rather than DSOs, could be in charge of regulating voltages and mitigating power peaks.

With this in mind, we focused on analyzing the decomposed OPF using game theory and mechanism design, which study the behavior and outcomes of a set of agents trying to maximize their own utilities $latex u_i(x_i,x_{-i})&s=1$, which depend on their own actions $latex x_i &s=1$ and on the actions of the other agents $latex x_{-i}&s=1$, under a given ‘mechanism’. The whole field of mechanism design tries to escape the Gibbard–Satterthwaite theorem, which is perhaps best understood by means of its corollary:

If a strict voting rule has at least 3 possible outcomes, it is non-manipulable if and only if it is dictatorial.

It turns out that the only way to escape this impossibility result is to adopt monetary transfers. As such, our mechanism must define both an allocation rule and a taxation (or reward) rule. In this way, the overall value seen by each agent equals its own utility augmented by the taxation/remuneration imposed by the mechanism:

$latex v_i (x_i,x_{-i})= u_i(x_i,x_{-i}) + c_i(x_i,x_{-i}) &s=1$

However, monetary transfers are as powerful as they are perilous. When designing taxes and incentives, one should always keep two things in mind:

  • Designing the wrong incentives can result in spectacular failures, as we learned from an anecdotal misuse of incentives in British colonial history known as the cobra effect
  • If there is a way to fool the mechanism, self-serving prosumers will almost surely find it. “Know that some people will do everything they can to game the system, finding ways to win that you never could have imagined.” (Steven D. Levitt)

A widely adopted solution concept, used to rule out most strategic behaviors from agents (though not the same as a strategyproof mechanism), is the ex-post Nash Equilibrium (NE), or simply equilibrium, which is reached when the following set of problems is jointly minimized:

$latex \begin{aligned} \min_{x_i \in \mathcal{X}_i} & \quad v_i(x_i, x_{-i}) \quad \forall i \in \{N\} \\ \text{s.t.} & \quad Ax \leq b \end{aligned} &s=1$

where $latex x_i \in \mathcal{X}_i &s=1$ means that each agent’s actions are constrained to lie in the set $latex \mathcal{X}_i &s=1$, which could include, for example, the prosumer’s maximum battery capacity or the maximum power at which the prosumer can draw energy from the grid. The linear inequality in the second row represents the grid constraints, which are a function of the actions of all the prosumers, $latex x = [x_i]_{i=1}^N &s=1$, where N is the number of prosumers we are considering.

Rational agents will always try to reach a NE, since at such a point no agent can improve its value as long as the other prosumers do not change their actions.
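This fixed-point property can be illustrated with an invented two-prosumer quadratic game (not the model of the paper) in which agents repeatedly play best responses to each other:

```python
# Toy best-response dynamics for two agents with values
# v_i = (x_i - a_i)^2 + x_i * x_j  (an invented example).
# At a Nash equilibrium, neither agent improves by deviating unilaterally.
def best_response(a_i, x_j):
    # argmin_x (x - a_i)^2 + x * x_j  ->  x = a_i - x_j / 2
    return a_i - x_j / 2

x1, x2 = 0.0, 0.0
for _ in range(50):                 # repeated best responses converge here
    x1 = best_response(1.0, x2)
    x2 = best_response(2.0, x1)
print(round(x1, 4), round(x2, 4))   # the equilibrium of this toy game
```

In this example the iteration contracts to the unique equilibrium (0, 2); with coupling constraints and more agents the same idea requires the machinery described next.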

Using basic optimization notions, the above set of problems can be reformulated using the KKT conditions, which under some mild assumptions ensure that the prosumers’ problems are optimally solved. Briefly, we augment each prosumer’s objective function with a first-order approximation of the coupling constraints, through a Lagrangian multiplier $latex \lambda_i &s=1$, and with the indicator function encoding its own constraints:

$latex \tilde{v}_i (x_i,x_{-i}) = v_i (x_i,x_{-i}) + \lambda_i^T (Ax-b) + \mathcal{I}_{\mathcal{X}_i}(x_i) &s=1$

The KKT conditions now read:

$latex \begin{aligned} 0 & \in \partial_{x_i} v_i(x_i, x_{-i}) + \mathrm{N}_{\mathcal{X}_i} + A_i^T\lambda \\ 0 & \leq \lambda \perp -(Ax-b) \geq 0 \end{aligned} &s=1$

where $latex \mathrm{N}_{\mathcal{X}_i}&s=1$ is the normal cone operator, i.e. the sub-differential of the indicator function.

Loosely speaking, the Nash equilibrium is not always a reasonable solution concept, because multiple equilibria usually exist. For this reason, equilibrium refinement concepts are usually applied, discarding most of the equilibria a priori. The variational NE (VNE) is one such refinement. In a VNE, the price of the shared constraints paid by each agent is the same. This has the nice economic interpretation that all agents pay the same price for the common good (the grid). Note that we have already taken all the Lagrangian multipliers to be equal, $latex \lambda_i = \lambda \quad \forall i \in \{N\}&s=1$, in writing the KKT conditions.

One of the nice properties of the VNE is that for well-behaved problems this equilibrium is unique. Being unique, and with a reasonable economic outcome (price fairness), rational prosumers will agree to converge to it, since at the equilibrium no one is better off changing its own actions while the other prosumers’ actions are fixed. It turns out that a trivial modification of the parallelized strategy we adopted to solve the multilevel hierarchical OPF can be used to reach the VNE.

On top of all this, new economic business models must be introduced to reward prosumers for their flexibility. In fact, rational agents would not participate in the market if the energy price they pay there is higher than what they pay their current energy retailer. One such business model is the aforementioned Californian proposal to extend NEM to the energy injected by electrical batteries.

Another possible use case is the creation of a self-consumption community, in which a group of prosumers in the same LV grid pays only at the point of common coupling with the DSO’s grid (which could be, e.g., the LV/MV transformer in figure 1). In this way, if the group of prosumers is heterogeneous (someone is producing energy while someone else is consuming), the overall cost they pay as a community will always be less than what they would have paid as single prosumers, at the DSO’s loss. But if this economic surplus drives the prosumers to take care of power quality in the LV/MV grid, the DSO could benefit from this business model, delegating part of its grid-regulation duties to them.
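The community surplus follows from simple netting, as this sketch shows (prices and net profiles are illustrative):

```python
# Sketch of community settlement at the point of common coupling (PCC).
# With buy price > sell price, netting heterogeneous profiles inside the
# community can never cost more than billing each prosumer separately.
BUY, SELL = 0.20, 0.08  # CHF/kWh, illustrative

def bill(net_kwh):
    """Settle a net position: imports pay BUY, exports earn SELL."""
    return BUY * net_kwh if net_kwh > 0 else SELL * net_kwh

nets = [3.0, -2.0, 1.0, -1.5]             # + consuming, - producing (kWh)
individual = sum(bill(n) for n in nets)   # each prosumer billed alone
community = bill(sum(nets))               # one settlement at the PCC
print(individual, community)
```

The gap between the two numbers is exactly the surplus the community can reinvest in grid services for the DSO.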

How does blockchain fit in? Synchronizing thousands of entities connected at different grid levels is a technically hard task. Blockchain technology can be used as a trustless distributed database for creating and managing energy communities of prosumers willing to participate in flexibility markets. On top of the blockchain, off-chain payment channels can be used to keep track of the energy consumed and produced by prosumers and to disburse payments in a secure and seamless way.

Different business models are possible, and so are different technical solutions. But we think that in the distribution grid, the economic value lies in shifting the power production and consumption of the prosumers, enabling a truly smarter grid.

At Hive Power we are enabling the creation of energy sharing communities where all participants are guaranteed to benefit from the participation, reaching at the same time a technical and financial optimum for the whole community.

Key links:



Reimagining A Cryptocurrency Energy Economy with Hive Power

We’ve always claimed that our vision at Hive Power is to create a world that shares energy to ensure a better and brighter future. One of the ways we are doing just that is by bringing solutions to a range of challenges being faced not just in the energy market, but also across blockchain technologies and projects. We thought we would take a moment to explore some of the key problems cryptocurrency energy initiatives are currently facing, and how Hive Power is reimagining better ways forward.

Problem 1: The energy market needs to evolve towards a more sustainable, economic future.

As the energy market has begun to shift away from larger power plants to smaller decentralized sources, a major gap exists in the development of exchange models that can help keep energy usage sustainable, while also lowering prices for consumers.

The Hive Power Solution: While the energy market is certainly facing an array of challenges, we created a turnkey solution utilizing Ethereum-based smart contracts to mitigate the most pressing energy industry issues.

We designed the Hive Token (HVT) to allow users to create and manage what we call “Hives”: distributed energy market platforms implemented using Ethereum-based smart contracts. Just think: if you or your neighbor had an excess of energy, and the ability to share this usable power with another individual in your community, it could strategically reduce costs for all involved, optimize consumption and create a more sustainable model for future demands on the grid. For those already looking to control their own energy future, registering for our ICO might be your solution.

Problem 2: Lack of research and expertise across cryptocurrency teams and leadership  

Blockchain projects are often led by teams that may understand cryptocurrency but have very limited expertise in the business sector in which they intend to operate. This has left buyers, enthusiasts and users with quickly dwindling value. The failures of a large number of projects can often be attributed to a lack of research and expertise at all levels.

The Hive Power Team Difference: The team at Hive Power has always been proud to be part of some of the world’s most well-regarded and sophisticated research institutions exploring the energy industry. In fact, our startup was incubated at SUPSI (the University of Applied Sciences and Arts of Southern Switzerland) and developed on deep knowledge and expertise earned over years of academic and corporate research. From our CEO’s career as an asset manager for European solar parks to our COO Davide still leading the energy systems research sector at SUPSI today, we maintain an in-depth knowledge of the challenges we face and the solutions we can offer. A detailed guide to both can be found in our white paper.

Problem 3:  Conceptual projects lacking real-world business initiatives.

A wide variety of cryptocurrency projects have dynamic-sounding solutions to some of the world’s most pressing social and financial problems, but no real idea of how to achieve them, nor the operational capability to see them through.

The Hive Power Way: Hive Power is more than a concept or a sales pitch coupled with a flashy white paper. We have developed our technology prototype and are already working with world-acclaimed partners spanning the corporate, public and academic sectors. These strong relationships have already helped move our project from its idea phase to being actionable. Partners like Landis+Gyr and the Swiss government’s center for energy research not only believe in and support the work we are doing, but are helping us hand in hand to enter the market successfully. Have a great idea on who else we should partner with next? Bring it up in our Telegram community and join the conversation.

Make sure to stay tuned for “Reimagining a Cryptocurrency Energy Economy Part 2”, where we will go deeper into the core technologies we are working on, how they are being implemented, and how we differentiate ourselves from our competitors.

Key links:



Exciting New Partnership Announcement for Hive Power & Liquidity.Network

We are thrilled to announce a strategic partnership between Hive Power & Liquidity.Network.

Why do Hive Power and Liquidity.Network fit together perfectly?

Crafting and maintaining energy communities that collaborate through a public blockchain requires a scalable, congestion-resilient and secure payment settlement system. As participants develop their respective optimizations and efficient trading strategies, the settlement system has to remain responsive and cost-efficient, even in the event of significant growth.

Liquidity.Network’s off-chain payment system is the current state-of-the-art real-time off-chain payment solution and has already been deployed on the Ethereum testnet. Opening payment channels is performed off-chain, and thus for free, which allows Hive Power communities to grow significantly while keeping operating costs to a minimum. Liquidity.Network’s trustless solution moreover features no rigidly locked funds, low maintenance costs, easy routing and off-chain refills. As such, it perfectly fits Hive Power’s market solution.

Energy Market Disruption

Energy markets are undergoing enormous changes, pushed by the increase of renewable generation and the emergence of innovative smart grid solutions. It’s likely that we will move towards a regional approach to grid control and security of supply. Newly improved price signals will be needed at a local level and energy markets will develop in the distribution grid to efficiently deal with problems like balancing and demand-side management. A higher number of stakeholders will be involved in these markets, as small producers and consumers will join them.

As a consequence, the amount of energy traded per market participant will decrease but, at the same time, real-time pricing will become more important, especially for grid balancing.

Let’s summarize:

  • Local energy markets are coming
  • Trading will be fast, to respond to rapidly changing price signals
  • Many stakeholders will be involved
  • The amount of energy per trade will be small, corresponding to micropayments

Hive Power aims to lead this transition. Hive Power’s core business is the development of a turnkey solution for the creation and management of local energy communities on the blockchain, providing an economic optimization through the development of efficient trading mechanisms. For such solutions to work, there is the fundamental need for an efficient way to transfer money between the market participants in a fast, secure and cost-efficient manner.

Liquidity.Network for Scalable Payments

When one attempts to implement local energy markets on a public blockchain purely based on smart contracts, one systematically runs into prohibitively high transaction fees (both in money and in energy) and a clear lack of scalability. Among the strategies proposed to deal with these issues, off-chain transaction systems are at the moment those with the highest odds of success, at least in the short term.

That’s where Liquidity.Network comes into play. Liquidity.Network’s off-chain payment network is perfectly tailored to Hive Power’s market solution, as it allows real-time small-value transfers at very low transaction costs in a secure and transparent way (actions are auditable). Another important innovative aspect of Liquidity.Network’s solution, which distinguishes it from other off-chain payment solutions, is the implementation of n-party payment hubs. This is ideal for Hive Power’s use case, in which the number of market players can be high and, under certain circumstances, they will not know a priori with whom they will trade.

Liquidity.Network’s Competitive Advantages

This section outlines the competitive advantages of Liquidity.Network with respect to alternative off-chain payment systems. As Hive Power’s platform is based on the Ethereum blockchain, we compare Liquidity.Network with Raiden, a well-known off-chain payment solution for Ethereum, and discuss why we believe Liquidity.Network offers a better alternative.

Low Maintenance Costs

In Raiden, a user has to deploy a new contract on the blockchain for every payment channel (PC). This design choice results in a huge number of identical contracts being deployed, instead of a single one able to manage (i.e. open/close/refill) many PCs. Moreover, this contract proliferation has no benefit in terms of security: since all the contracts are identical, they are all vulnerable to the same potential bug.

Liquidity.Network provides a light and secure approach to the creation of new PCs. Opening a payment channel is free and done instantaneously off-chain. Moreover, it does not require the deployment of new contracts, avoiding a huge number of identical contracts operating on the blockchain.

No Rigid Funds Locked

In Raiden, only 2-party payment channels can be established, and each of them requires collateral to be locked up. Using this solution to implement a local market with a reasonable number of users would require a huge amount of collateral to be staked.

Liquidity.Network provides n-party payment hubs. No collateral is required to open a channel, while trustlessness is retained. The overall collateral requirements of a Liquidity.Network hub are significantly lower than if Raiden were used to build a hub with its 2-party payment technology.
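A back-of-the-envelope count of funded channels (our own illustration, not Liquidity.Network figures) shows why a hub structure scales better for a market where anyone may trade with anyone:

```python
# Channels needed so that any pair of n market participants can pay each other.
def pairwise_channels(n):
    """2-party topology: every pair needs its own funded channel."""
    return n * (n - 1) // 2

def hub_channels(n):
    """Hub topology: each user connects once to the hub."""
    return n

for n in (10, 100):
    print(n, pairwise_channels(n), hub_channels(n))
```

The pairwise count grows quadratically while the hub count grows linearly, and with Raiden each pairwise channel also carries its own locked collateral.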

Off-chain refill

In Raiden, an open PC can be refilled, but only with an expensive on-chain transaction.

In Liquidity.Network, PCs can be refilled without any on-chain transaction.

Easy Routing

Thanks to its hub-network structure, Liquidity.Network requires significantly less routing than non-hub solutions like Raiden. In practice, routing takes place at the hub level, and hubs can be efficiently interconnected in terms of decentralization and redundancy.

As can be seen, the benefits of implementing Liquidity.Network as an off-chain payment system are substantial. With this solution, energy sharing communities are guaranteed the ability to transfer money between market participants in a fast, secure and cost-effective manner. For this reason, both the Hive Power and Liquidity.Network teams are thrilled to work together towards the revolution of the energy market, where local energy communities will prevail, trading will be faster, and many stakeholders will be involved.

To learn more about blockchain payments for everyone by Liquidity.Network:

To learn more about the energy sharing communities by Hive Power go here:

Don’t forget to join our Telegram Channel:


Collaboration with Eidoo ICO Engine for Hive Power’s Crowdsale — Everything you need to know

Hive Power is happy to collaborate with the Eidoo ICO Engine in order to be fully compliant with the new regulations released earlier this year by the Swiss Financial Market Supervisory Authority (FINMA). Among other things, the regulations focus on anti-money laundering, which includes the need to establish the identity of beneficial owners.

This is where the Eidoo ICO Engine steps in, providing Tier 1 and Tier 2 identification for its users. In this step-by-step guide you will find out how to register on the Eidoo ICO Engine and how to participate in the Hive Power crowdsale on the 12th of June 2018.

To participate in the Hive Power ICO, you need to register on the Eidoo ICO Engine website and identify yourself through the KYC (Know Your Customer) procedure. To do so, you need to complete Tier 1 (up to CHF 3'000 per year) and Tier 2 (up to CHF 500'000 per year). For each tier, you will be required to provide different documents, which are specified below. This is necessary to comply with future AML (Anti-Money Laundering) requirements and to prevent unauthorized people from participating in the ICO.

After the KYC procedure, you will receive your referral link and you will be able to participate in the Hive Power airdrop!

Below is a step-by-step guide to help you go through the KYC procedure, get your referral link, and participate in the Hive Power airdrop!

Before starting, note that Chinese and U.S. citizens won’t be able to join the ICO.

1) Download the Eidoo app

The app is available for iOS and Android. It will give you your wallet address, which is necessary to send ETH and buy the tokens.

Please, use Google Chrome from now on.

2) Register on the Eidoo ICO Engine Website


Go to Eidoo ICO Engine

To register, you will need to provide the following information: name, surname, email, and password.

You will receive an email with a link to confirm your registration. This link will redirect you to your first log-in.


In the upper-right corner of the page, click on your profile. You will now see your profile page, where you can enter your name and surname (or a nickname), a photo and a description, and choose your privacy settings.

Finally, you can add two-factor authentication using Google Authenticator for increased security.

3) Identification — Tier 1


To complete Tier 1, click the “Identification” item in the left menu.

Remember that Tier 1 will allow you to join the ICO with a maximum of CHF 3'000 per year.

Here you will need to enter the following information: name, surname, address, house number, city, zip code, country, email, telephone number, nationality, date of birth, etc. You will also need to enter your Eidoo wallet address and declare that you are not an American citizen or subject in any way to US taxation.

4) Identification — Tier 2

After completing Tier 1, accept the “Terms and Conditions” and click on “Upgrade to Tier 2”.

Remember that Tier 2 will allow you to join ICOs with a maximum of CHF 500'000 per year.


In Tier 2, you will be asked:

  • For proof of identity (passport OR driving license OR identity card)
  • For proof of residence (bank statement OR utility bill)
  • For a selfie (you can upload it OR take one with the webcam)
  • To select the type of verification (private OR business)
  • To agree to the terms and conditions

Depending on whether you click on “Private” or “Business”, you will need to fill in form A (private) or form K (business). Some fields of this form will be filled in automatically. Finally, you will need to sign the document using Google Authenticator 2FA (two-factor authentication).


5) Connect the ICO Engine profile to the Eidoo wallet

After completing the KYC procedure, you need to connect your verified ICO Engine profile to your Eidoo wallet address. To complete this step, return to your profile page and click the “My addresses” item in the left menu. Here you will need to insert your Eidoo wallet address, which you obtained when downloading the Eidoo app (step 1).


6) Buy the HVT tokens

Finally, to buy the Hive Token, log in to the Eidoo app, click on “ICO list“, choose the Hive Power ICO and insert the amount of ETH you want to use to buy HVT.


7) Referral URL

After registration, participants will receive a unique referral URL (on the website). This URL will be used to reward participants who promote Hive Power to their friends: participants will receive a token bonus equivalent to 5% of the tokens bought by referred buyers. The bonus tokens will be distributed from the Airdrop and Referral Program fund after the end of the Crowdsale.

Guide to get your referral link

The referral link can be found in your Hive Power profile on ICO Engine.

Once you click on the “Get referral code” button, the system will generate your unique referral code, which you can share with your network.


If you want to see all of your referral links, you can easily do so from your ICO Engine personal profile.

The more referral clicks you collect, the bigger your reward for spreading the word about Hive Power!


8) Airdrop

In August 2018, Hive Power will distribute 500’000 HVT tokens to reward HVT token holders. Users who have not moved their HVT tokens after the Crowdsale will be eligible to receive the airdrop. The airdrop fund will be distributed to token holders proportionally to their HVT balance.
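The pro-rata rule above can be sketched in a few lines of Python. This is a toy illustration, not the official distribution code; the addresses and balances below are invented:

```python
# Toy sketch of a proportional airdrop: the 500'000 HVT fund is split
# among eligible holders in proportion to their HVT balance.

AIRDROP_FUND = 500_000  # HVT

def airdrop_shares(balances):
    """balances: dict of address -> HVT balance of eligible holders."""
    total = sum(balances.values())
    return {addr: AIRDROP_FUND * bal / total for addr, bal in balances.items()}

# Hypothetical balances, for illustration only
holders = {"0xA": 1_000, "0xB": 3_000, "0xC": 6_000}
shares = airdrop_shares(holders)
# shares: 0xA -> 50'000 HVT, 0xB -> 150'000 HVT, 0xC -> 300'000 HVT
```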

Something isn’t clear? For questions please join our Telegram channel

At Hive Power we are enabling the creation of energy sharing communities where all participants are guaranteed to benefit from the participation, reaching at the same time a technical and financial optimum for the whole community.

Join our Telegram Channel:

Key links


Why the energy market needs decentralized architectures

One of the reasons for the high popularity of cryptocurrencies is that they enable a decentralized economic system. This point is unanimously considered pivotal in the crypto world, where the worst insult for a project is calling it ‘centralized’. However, the definition of decentralization is often poorly understood. Even worse, a recent study shows that cryptocurrencies are not as decentralized as one might think, considering that the top four miners in Bitcoin and the top three miners in Ethereum control more than 50% of the hash rate.

Decentralization is often explained in terms of the communication architecture of a network, and a distinction is usually made between decentralized and distributed networks. This very famous picture explains the differences eloquently:

Differences between centralized, decentralized and distributed architectures

The picture is somewhat self-explanatory, but we can try to give a tentative definition of the three architectures:

  • centralized architecture: all information passes through a single node
  • decentralized architectures: not all information passes through a single node
  • distributed architectures: nodes communicate only with their neighbors

This (very personal) definition makes distributed architectures a subclass of decentralized architectures.
This picture was first published in 1964 by Paul Baran, a pioneer in the field of computer networks. When Baran published his work, he was considering distributed networks as a way to increase the resilience of the national communication infrastructure in the context of a possible atomic war:

…it can be shown that highly survivable system structures can be built, even in the thermonuclear era

— Baran, Paul. “On distributed communications networks.” IEEE transactions on Communications Systems 12.1 (1964): 1–9.

From this point of view, in which the communication network is operated by a trusted entity (the government) and the architecture’s only purpose is to guarantee communication, there is no further need to consider decentralization under other aspects. In this post on the meaning of decentralization, Vitalik Buterin explains why, for cryptocurrencies and distributed ledger technologies, the definition of a system’s degree of decentralization needs to be expanded.

Briefly speaking, three key aspects can be considered:

  • architectural decentralization: this coincides with the definition given by Baran. The system should be geographically dispersed in order to be robust against malicious attacks. The implicit assumption is that the attacker’s costs are sublinear: destroying one very big central unit is cheaper than destroying thousands of smaller dispersed communication units.
  • political decentralization: decisions concerning the protocols running the network should be made by several individuals/organizations with no concurrent interests in manipulating the network. This aspect becomes important once we exit an ‘us’ versus ‘them’ mindset, in which the network operator is trusted and the system must only be protected against outsiders’ attacks. Note that in the case of cryptocurrencies cartel formation is completely expected — Vlad Zamfir, The History of Casper — Chapter 4.
  • logical decentralization: this concerns the ‘state’ of the system, where ‘state’ means all the data available through the network. The BitTorrent platform is logically decentralized, since the network’s data is stored by the peers, each of them storing only a part of the whole available data. The Ethereum network, and DLTs in general, are logically centralized, since it is desirable that all the peers see a coherent (the same) network state at any time. This coherency comes at the cost of highly redundant data structures and falls under the curse of the CAP theorem: if the system gets partitioned, only one of consistency and availability can be guaranteed.

Now that we have a good grasp on the meaning of, and problems connected to, (de)centralized networks, we can start to discuss decentralized architectures for energy markets.

More and more renewables are being installed in the distribution grid, especially photovoltaics. Solar panels are highly stochastic energy sources with high volatility. This volatility calls for increased flexibility on the demand side and creates a sweet spot for energy communities implementing local energy markets. These markets will have three kinds of participants:

  • Local producers, who can sell their excess energy at higher prices
  • Consumers with flexible loads (e.g. water heaters, heat pumps, EV chargers), who can get a discount for load shifting
  • Battery owners, who can sell their storage capability

These three actors could have different motivations for participating in such an energy community, among which:

  • A reduction of their electricity bills thanks to the increased self-consumption of the community. The energy communities will also be able to sell their flexibility to distribution system operators, generating additional profits for their members.
  • An increase in the share of locally produced clean energy that they consume.

Let us focus on the specific problem of the choice of the architecture for such a market design.

First of all, we don’t have to start from scratch: the electrical grid is itself a network with its own architecture. The existing infrastructure is massive. It was built with billions of dollars of investment and can be divided essentially into:

  • Electrical transmission and distribution network (cables, transformers, capacitor banks, FACTS, etc…)
  • Communication infrastructure for metering and control (PLC, optical fibers, etc.)

The network architecture of the electric power grid is also peculiar. In particular, it is divided into different voltage levels. The choice of the voltage level is a function of:

  • The distance to be covered
  • The amount of power that needs to be transported

Per unit of length, high voltage lines are more expensive than medium and low voltage ones, but the amount of power they can transport and the distance they can cover are much higher, thanks to lower losses. Indeed, raising the voltage by a factor of 10 reduces the current by a corresponding factor of 10 and therefore the I²R losses by a factor of 100.
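The scaling argument can be checked numerically. The sketch below assumes a fixed line resistance and unity power factor, which is a simplification (in reality the conductor resistance also differs between line types):

```python
# Illustration of the I^2·R scaling argument: transporting the same power
# at 10x the voltage draws 1/10 of the current and cuts resistive losses
# by a factor of 100 (line resistance R held fixed for simplicity).

def line_losses(power_w, voltage_v, resistance_ohm):
    current = power_w / voltage_v          # P = V·I (unity power factor assumed)
    return current ** 2 * resistance_ohm   # P_loss = I^2·R

p = 1_000_000  # 1 MW to transport
r = 5          # ohms, hypothetical line resistance
low = line_losses(p, 50_000, r)    # at 50 kV
high = line_losses(p, 500_000, r)  # at 500 kV
assert low / high == 100.0         # 10x voltage -> 100x lower losses
```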

In distribution systems, once the electricity approaches the point of consumption, the voltage is gradually decreased, thereby decreasing the cost of the lines and reducing the possible dangers in case of a short circuit.

Example of transmission grid in southern Switzerland. Red, green, yellow and blue lines represent 380, 220, 150 and 50 kV lines, respectively. Source: AET.

In the above figure, we can see the high voltage (HV) and medium voltage (MV) lines in the region around Hive Power’s headquarters, ranging from 380 kV (red lines) to 50 kV (blue lines).

Example of LV distribution grid, showing a typical radial structure. Source: IEEE.

The low voltage (LV) network, like the one shown in the above figure, is much more ubiquitous. In the most common case, the topology of the low voltage network is radial, meaning it has a simple tree-like structure.

This structure naturally partitions the system. From the physical point of view, the effects of an LV network on the upper MV level can be taken into account solely by means of the total power at the transformer. In other words, it is not required to know the power consumption of all the buildings in the LV network at a given time to effectively control the MV level, nor is it required that a prosumer located in LV network A knows about all the energy produced or consumed by the prosumers in LV network B in order to effectively exchange energy.

And now a very important point:

“The mechanism design of new energy markets must explicitly consider the effect of traded energy on the electrical grid. The energy prices must reflect the state of the grid.”

This point is essential for understanding how we view the market and its communication infrastructure. In the presence of distributed generation from renewable energy, e.g. PV, the power production gets highly synchronized. This synchronization is a possible hazard for the electrical grid, since it can overload electrical lines. Furthermore, power production from renewable energies is highly volatile, influencing the local power quality.

New energy markets are in charge of mitigating the effects of an increasing penetration of renewable generators in the electric grid. The common view in the scientific community is that this could be done by means of demand response programs, in which the energy price is changed dynamically based on the state of the grid.

These considerations lead us to analyze another sub-class of decentralized architectures, which is a very good candidate for decentralized energy markets: hierarchical structures. Hierarchical structures are essentially tree-like structures, in which each node can be a terminal or a branching node. Terminal nodes are the ‘leaves’ of the tree, and have no downwards connections. In our energy markets, terminal nodes are single prosumers.

A tree like structure, in which blue hexagons represent terminal nodes and red hexagons branching nodes. The orange hexagons gather nodes with the same parent nodes into groups.

The picture above depicts an example of a hierarchical structure, in which the blue terminal nodes with the same parent node are gathered together in a group. Note that this structure is somewhat fractal: a group can be seen as a single terminal node when viewed from the upper level.

Back to the energy markets and decentralized systems!

How does this architecture fit into the aforementioned classification, and why does it make sense for decentralized energy markets? Let’s reconsider each point one by one:

  • architecture: the architecture is geographically decentralized, but not fully distributed. That is, not all the nodes are equally important from the point of view of an attacker who would like to make the whole system unavailable. Note, however, that the same is true for the electric system. Furthermore, and more importantly, remember that the groups are decoupled, both physically and logically. This means that if, for some reason, communication with the root node (the one located at the top level) is lost, prosumers in the communities at the lower levels can still effectively trade energy among peers in the same community.
  • political: the hierarchical architecture makes the branching nodes pivotal for the energy market to work. This empowers the owners of the branching nodes with respect to simple prosumers. In order to eliminate this issue, we can introduce a governance system regulated by smart contracts.
  • logical: the hierarchical architecture does not influence logical (de)centralization per se. However, remember that the system we want to operate is decoupled, and physical effects can be taken into account by means of aggregated power at upper levels. That is, both energy trading and grid control are possible if information is aggregated at each branch of the structure. This aggregation both avoids unnecessary information flows and preserves the privacy of the prosumers: only aggregated information about energy consumption is available at higher levels of the structure; furthermore, even prosumers belonging to the same group have only aggregated information about each other.
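The aggregation idea can be sketched in a few lines of Python. This is a minimal illustration with invented node names and power values, not Hive Power’s actual implementation: each branching node exposes only the summed power of its subtree, so upper levels never see individual readings.

```python
# Minimal sketch of hierarchical aggregation: a branching node reports only
# the aggregated power of its subtree, so upper levels (and sibling groups)
# never see individual prosumer readings.

class Node:
    def __init__(self, name, power=0.0, children=None):
        self.name = name
        self.power = power            # own consumption (+) / production (-), kW
        self.children = children or []

    def aggregated_power(self):
        """Total power of the subtree, as seen from the parent level."""
        return self.power + sum(c.aggregated_power() for c in self.children)

# Hypothetical LV group: three prosumers behind one transformer node
group_a = Node("LV-A", children=[
    Node("house1", 3.0), Node("house2", -5.0), Node("pv_plant", -2.0)])
root = Node("MV", children=[group_a, Node("LV-B", 10.0)])

# The MV level only needs the per-group totals, not individual readings
assert group_a.aggregated_power() == -4.0
assert root.aggregated_power() == 6.0
```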

Hierarchical structures also have another peculiar aspect, which is strictly related to mechanism design, a field of economics and game theory which aims at building market rules that induce a desired effect on the market equilibria. For instance, the CASPER protocol of Ethereum is seen by its creators as the result of applying mechanism design to cryptoeconomics.

Designing a market that turns competition into cooperation

One of the most celebrated outcomes of mechanism design is the revelation principle, which states that:

If the market is incentive compatible, we can restrict the study to the situation in which each participant is willing to disclose its private information.

This means that no agent would have an incentive to lie about their power forecasts or about the expected utility of using a certain amount of energy. Let’s work through an example to clarify the implications.

Consider that each market player will adopt an optimal strategy (in terms of outcomes) given his private information. A player could lie when reporting his private information if he finds some advantage in doing so. For example, imagine we have designed a market in which prosumers pay a price proportional to their consumption and, if they consume more than the average, they pay an additional fee. If prosumer A declares that he is going to consume a lot of energy in the next market period, the other prosumers could increase their consumption plans, since they believe themselves to be under-the-average consumers. In the next step, prosumer A consumes much less than he had previously declared, but there is no time for the other prosumers to synchronize again with updated information. As a result, A is now an under-the-average consumer. What has happened is that, by lying about his private information, A has avoided the risk of paying the additional fee, to the detriment of the others.
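The story above can be turned into a toy numerical example. The price and fee values below are invented; the only point it makes is that the above-average fee lands on whoever ends up over the realized average:

```python
# Toy model of the example above: prosumers pay price * consumption, plus a
# fixed fee if they consume more than the group average. All numbers are
# hypothetical; the point is only that misreporting can shift the fee.

PRICE, FEE = 0.20, 5.0  # CHF/kWh and CHF, illustrative values

def bills(consumptions):
    """Map each prosumer to their bill, given realized consumptions in kWh."""
    avg = sum(consumptions.values()) / len(consumptions)
    return {who: PRICE * kwh + (FEE if kwh > avg else 0.0)
            for who, kwh in consumptions.items()}

# A declared high consumption, so B and C raised their plans believing they
# were under the average. A then actually consumes little: the realized
# average is 20 kWh, A dodges the fee, and B and C pay it instead.
realized = {"A": 10.0, "B": 25.0, "C": 25.0}
print(bills(realized))  # A pays no fee; B and C each pay the 5 CHF fee
```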

How can this be avoided? In the simplest form, prosumers (leaf nodes) can agree to communicate their private information to a super-partes entity (their parent node), which plays the optimal strategy on their behalf. Finding the optimal strategy generally involves solving a pre-defined optimization problem. The important thing is that each prosumer has previously agreed on how this optimal strategy is found, and that all of them consider the super-partes entity trustworthy. In this case, prosumers have no interest in lying, since doing so would, by definition, result in a payoff reduction!

In view of the above-mentioned benefits, at Hive Power we decided to design our distributed energy market platform making use of aggregators. Of course, these aggregators should either be trusted or, even better, auditable.

In my next posts I will discuss how we will:

  • model the market in a dynamic and stochastic setting
  • take into account grid constraints
  • preserve user privacy

I will also discuss alternative solutions for the intra-group communication. Stay tuned!


Key links:



HVT Token Model

An overview of the ERC20 token that grants access to the Hive Power ecosystem

The Hive Token (HVT) is a standard Ethereum ERC20 token managed by a smart contract, which gives access to the Hive Power ecosystem and its management.

Token name: Hive Token
Token symbol: HVT
Token type: ERC20
Maximum supply: 100 million HVT


Platform access

The main purpose of HVT is to be used for the creation and management of Hives. Hives are distributed energy market platforms implemented in smart contracts, in which registered participants can exchange energy with each other in a cost-effective way.

Hive owners and their market participants will have access to a number of services:

  • A list of blockchain-ready trusted meters
  • Access to a low-cost off-chain payment system
  • Access to the HONEY algorithm
  • Access to a forecasting service


Users with staked HVTs will also have access to Hive Power technical governance. More specifically, all upgrade proposals for the smart contracts will be subject to their vote. In order to exclude potential speculative actors, only Hive Owners will participate in the governance. Their voting power will be weighted according to staking age.

Stake and burn mechanism

In order to create a new Hive, the future Hive owner will need to transfer HVTs to a smart contract, the Beekeeper. The Beekeeper contract performs a “burn and stake” on the received HVTs: 50% of the HVTs are taken permanently out of supply and the remaining 50% are staked inside the contract. The burn mechanism is intended to disincentivize unnecessary Hive management operations and to reach a stable operation of the Hive Power platform. After the creation of a new Hive, the Hive owner can send additional HVT to the Beekeeper contract in order to assign meters to the newly created Hive. This operation uses the same “burn and stake” mechanism. When a Hive is destroyed or a meter is detached, the remaining 50% of the HVT tokens is sent back to the Hive owner’s address. The required HVT amount for a Hive creation is variable, in order to ensure a stable equivalent cost in fiat. The cost will be decided before the launch of the Hive Power 1.0 release.
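The accounting of the “burn and stake” rule can be sketched as follows. The real mechanism lives in an Ethereum smart contract; this is only a plain-Python illustration of the 50/50 split, with invented hive IDs and amounts:

```python
# Accounting sketch of the Beekeeper "burn and stake" rule: half of the
# received HVT is permanently burned, half is staked and returned when
# the Hive is destroyed.

class Beekeeper:
    def __init__(self):
        self.burned = 0.0
        self.staked = {}   # hive_id -> staked HVT

    def create_hive(self, hive_id, hvt_sent):
        # 50% permanently out of supply, 50% staked inside the contract
        self.burned += hvt_sent / 2
        self.staked[hive_id] = self.staked.get(hive_id, 0.0) + hvt_sent / 2

    def destroy_hive(self, hive_id):
        # the staked half returns to the Hive owner; the burned half is gone
        return self.staked.pop(hive_id)

bk = Beekeeper()
bk.create_hive("hive-1", 1000.0)
assert bk.burned == 500.0 and bk.staked["hive-1"] == 500.0
assert bk.destroy_hive("hive-1") == 500.0  # owner gets the staked half back
```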

Hive creation cost

The cost for the creation of a Hive will be a function of the amount of available HVT, following the simple rule:

cost_Hive = HVT_a / N

where HVT_a is the total amount of HVT which is neither staked in the Beekeeper contract nor burned, and N is a constant defining the number of Hives that can be created using the available HVT tokens. In this way, even if more and more HVT are staked and burned, the cost of a Hive in fiat should not increase as a consequence of the increasing HVT scarcity. On the other hand, the number of Hives that can be created with a certain amount of HVT will increase with the number of Hives that have already been created.

The next figure illustrates the evolution of the cost in HVT for the creation of a Hive as a function of the number of Hives already created. In this particular case, the initial price for the creation of a Hive at the moment of the crowdsale is fixed at $50.

The exact same rule will be applied to the amount of HVT required to add a meter to a Hive, which will also be a function of the amount of non-staked HVT. Obviously, the required amount of HVT will be lower than that required for the creation of a Hive.
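Assuming the rule cost = HVT_a / N, with HVT_a the circulating (non-staked, non-burned) supply and N the constant defined above, the cost evolution can be sketched as follows. The supply and N values below are purely illustrative:

```python
# Sketch of the Hive creation cost evolution under the rule cost = HVT_a / N.
# Each creation removes its full cost from circulation (half burned, half
# staked), so the next creation is slightly cheaper in HVT terms.

def hive_cost_series(initial_supply, n_constant, n_created):
    """Cost in HVT of each successive Hive creation."""
    hvt_a, costs = initial_supply, []
    for _ in range(n_created):
        cost = hvt_a / n_constant
        costs.append(cost)
        hvt_a -= cost  # the whole cost leaves circulation
    return costs

costs = hive_cost_series(initial_supply=100_000_000, n_constant=1000, n_created=3)
# the first creation costs 100'000 HVT; each later one is slightly cheaper
assert costs[0] == 100_000.0 and costs[1] < costs[0] and costs[2] < costs[1]
```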

Join our Telegram Channel:

Key links

Demo Hive: Our First Successful Implementation of a Blockchain-based Energy Market

As a consequence of the foreseen significant increase of stochastic generation in the electrical grid, the need for flexibility and coordination on the demand side is expected to rise. Decentralized energy markets are among the most promising solutions for boosting coordination between production and consumption, since they allow even small actors to capitalize on their flexibility. The main purpose of Hive Power is to develop a blockchain-based platform to support groups of prosumers that want to create their own energy market. The core element of this framework is the so-called Hive, i.e. an implementation of an energy market based on blockchain technology (see our white paper for detailed information about the Hive Power platform).

This article describes Demo Hive, the first testbed developed by our team and presented during the Energy Startup Day 2017 in Zurich, Switzerland on November 30th, 2017. In practice, the demo is a simple but meaningful case of a Hive; it consists of a producer and a consumer, the so-called workers. A third element is the QUEEN, whose aim is to manage the interaction between the workers and the external grid and to track the measurements of the power consumed/produced by the workers. The producer, hereafter named SOLAR, simulates a photovoltaic plant with a nominal power of 5 kWp, while the other worker (LOAD) generates load consumption data. Fig. 1 shows the demo testbed.

Fig1: The Demo Hive testbed

Essentially, the main hardware components of Demo Hive are:

  • Two SmartPis, one for each worker. This device consists of an acquisition board for the electrical measurements (voltages and currents) connected to a Raspberry Pi 3. In Fig. 1 the two workers are the black boxes at the bottom.
  • A Raspberry Pi 3 providing the QUEEN functionalities.
  • A 5G router providing Internet connectivity and a WLAN inside the testbed.

Energy tokenization:

One of the most meaningful aims of Demo Hive is to tokenize the produced/consumed energy and to save the related information on a blockchain. For that reason, an ERC20-compliant smart contract was deployed on the Ropsten network in order to create a demo token, called DHT, which has the following fixed value:

  • 1 DHT = 1 cts = 0.01 CHF

The basic idea of Demo Hive is that LOAD owns a certain amount of DHTs and sends part of them to the producers (typically SOLAR, but also the external grid through QUEEN) to buy energy. In the following section, this aspect will be described in detail.

Operation mode:

A set of applications, some of them developed by Hive Power, runs on the aforementioned devices to implement the Demo Hive platform. In this article only the main behavior of the demo testbed will be described, without explaining all the code in detail. The following image shows the software interactions inside the demo and with the Ropsten network.

Fig 2: Demo Hive software interactions

As written in our whitepaper, the real Hive platform will periodically save data about the tokenized energy on a blockchain. This is quite inconvenient in a demo testbed, because the period can be too long. For that reason, the demo software considers virtual days with a duration of just 10 minutes: the SOLAR worker produces in 10 minutes the same energy a real plant would produce in 24 hours. Similarly, the power measurements, which in a real application are performed off-chain and usually acquired every 15 minutes, are measured in Demo Hive every 5 seconds. As shown in Fig. 2, during the 10-minute virtual day the power measurements are saved by the workers on QUEEN (black arrows) in an InfluxDB database, a time-series oriented DBMS commonly used in monitoring applications. When the simulated day ends, the workers’ energies are calculated and tokenized in DHTs considering the following static tariffs:

  • Buy on grid: 20 cts/kWh
  • Sell on grid: 5 cts/kWh
  • Buy in the Hive: 10 cts/kWh
  • Sell in the Hive: 10 cts/kWh

Note that the LOAD and SOLAR workers can only buy and sell energy, respectively, while QUEEN, managing the interface with the grid, is allowed to perform both operations. At the end of a simulated day, a tokenization algorithm tries to maximize the hive autarky using the following rules (see also Fig. 2):

If E_LOAD > E_SOLAR, LOAD buys E_SOLAR from SOLAR (10 cts/kWh) and E_LOAD - E_SOLAR from QUEEN (20 cts/kWh).
If E_SOLAR > E_LOAD, SOLAR sells E_LOAD to LOAD (10 cts/kWh) and E_SOLAR - E_LOAD to QUEEN (5 cts/kWh).

In practice, the workers exchange all the available energy within the hive, exploiting the more convenient tariffs.
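The settlement rules can be sketched as follows. This is a simplified illustration using the static tariffs above; the energy values in the example are invented:

```python
# Sketch of the end-of-day settlement: the hive-internal exchange is
# maximized, and only the residual energy is traded with the grid via QUEEN.

TARIFFS = {"grid_buy": 20, "grid_sell": 5, "hive": 10}  # cts/kWh, 1 DHT = 1 ct

def settle(e_load_kwh, e_solar_kwh):
    """Return the DHT transfers for one simulated day."""
    hive_energy = min(e_load_kwh, e_solar_kwh)  # exchanged inside the hive
    return {
        "LOAD->SOLAR": hive_energy * TARIFFS["hive"],
        "LOAD->QUEEN": max(e_load_kwh - e_solar_kwh, 0) * TARIFFS["grid_buy"],
        "QUEEN->SOLAR": max(e_solar_kwh - e_load_kwh, 0) * TARIFFS["grid_sell"],
    }

# Hypothetical CLEAR-like day: production exceeds consumption
transfers = settle(e_load_kwh=30, e_solar_kwh=40)
# LOAD->SOLAR = 300 DHT, LOAD->QUEEN = 0 DHT, QUEEN->SOLAR = 50 DHT
```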

The energies are thus tokenized in DHTs, and the related tokens (as written before, 1 DHT = 1 ct) are sent by the buyers (LOAD or QUEEN) to the sellers (SOLAR or QUEEN) according to the aforementioned algorithm. In Fig. 2 these operations are represented by the red and light blue arrows. The DHT transfers are then saved on the Ropsten blockchain. This is possible because on each demo device a geth client maintains a node synchronized with the Ethereum testnet. In order to minimize the required disk space, the geth instances run the Ethereum light client protocol. The Ropsten accounts of the components are reported below:

Simulation results:

As explained above, the Demo Hive testbed simulates “virtual” days with a duration of 10 minutes. During a single day, the produced/consumed power of the two workers is saved every 5 seconds. At the end of the day (i.e. after 10 minutes), the related energies are calculated, tokenized and saved on the Ropsten network. In order to have days covering both cases of the autarky algorithm (i.e. solar production > load consumption and solar production < load consumption), the following power profiles are taken into account for the workers:

  • SOLAR: two profiles are considered, the former (hereafter named CLEAR) with a significant production, corresponding to a day without clouds, and the latter (hereafter named CLOUDY) with a poor production, simulating an overcast day. The simulated days continuously alternate between the two profiles, i.e. after a CLEAR day comes a CLOUDY one, and so on.
  • LOAD: a single typical profile is taken as a baseline, and every day noise is added to it. As a consequence, the resulting daily profiles are always similar, but never identical.

Fig. 3 shows an example of two simulated days. It is easy to see the difference between the CLEAR and CLOUDY cases.

Fig 3: Profiles of two simulated days (light blue: SOLAR, dark yellow: LOAD)

The profiles shown in Fig. 3 were recorded during the Energy Startup Day 2017. Considering the first profile (CLEAR), it is easy to see that the SOLAR production exceeds the LOAD consumption. As a consequence, all the energy needed by LOAD is bought locally in the hive from the SOLAR producer at the convenient Hive tariff (i.e. 10 cts/kWh). The remaining produced energy not bought by LOAD is sold by SOLAR to the grid at a less convenient tariff (i.e. 5 cts/kWh). In this way, the local energy exchange is maximized and, consequently, the two workers save/earn money by taking advantage of the Hive tariffs.

In the second case (CLOUDY profile), the production is not able to cover all the consumption. Thus, LOAD has to buy part of the needed energy from the grid, paying 20 cts/kWh.

At the end of the simulated day, the savings/profits are tokenized and the related DHTs are transferred by the consumer (e.g. LOAD in a CLOUDY case) to the producers (e.g. SOLAR and QUEEN in a CLOUDY case) in order to pay for the used energy. The following list reports the energy profits/costs in DHTs, comparing Demo Hive against a business-as-usual (BAU) situation in which the hive market does not exist (i.e. only the grid tariffs, 20/5 cts/kWh to buy/sell energy, are available).

  • SOLAR revenues:
      12:00-12:10 (CLEAR): HIVE = 432 DHT, BAU = 254 DHT
      12:10-12:20 (CLOUDY): HIVE = 135 DHT, BAU = 68 DHT
  • LOAD costs:
      12:00-12:10 (CLEAR): HIVE = 356 DHT, BAU = 713 DHT
      12:10-12:20 (CLOUDY): HIVE = 590 DHT, BAU = 725 DHT (HIVE - BAU = -123 DHT)

It is easy to see that the money saved/earned by LOAD/SOLAR is much higher during the CLEAR day, since the solar production covers all the energy needed inside the hive. The following list reports the precise amounts:

  • LOAD saves 3.57 CHF during CLEAR days
  • LOAD saves 1.23 CHF during CLOUDY days
  • SOLAR earns 1.78 CHF during CLEAR days
  • SOLAR earns 0.67 CHF during CLOUDY days

The following URLs report the details of the Ropsten transactions related to the simulated days.

Next steps:

The Demo Hive testbed implements a very simple case of a Hive. It is a significant starting point for the development of the complete framework, but some improvements still have to be implemented. The following list reports the most meaningful features still to be developed.

  • Prototype of a “blockchain-ready” meter: the SmartPi device is based on a Raspberry Pi 3 board, a great hardware platform for prototyping and initial tests, but not designed to be easily integrated into an industrial product. In order to develop a blockchain meter, which is naturally necessary in our framework, the idea of Hive Power is to consider more industrially-oriented hardware platforms and use them to replace the SmartPi devices.
  • Power profiles: currently the worker profiles are quite similar across the 10-minute “simulated days”: there is a strict alternation of clear and overcast days for the SOLAR production, while for LOAD, noise is added every simulated day to the same predefined profile. In order to have a more realistic situation, new profiles have to be considered (e.g. two different LOAD profiles, one for workdays and one for the weekend).
  • State channels: in the demo testbed, the power measurements are currently acquired every 5 seconds and the related data saved in a database running on QUEEN. In order to have a fully decentralized approach, our idea is to handle power data using state channel technology, avoiding the use of a local database.
  • More workers: To have a more realistic simulation of a Hive energy market, the number of workers should be increased.
  • Prosumer/Storage worker: since Demo Hive currently includes only a consumer (LOAD) and a producer (SOLAR), it will be meaningful to introduce prosumer and storage workers in order to have a complete market. It is interesting to note that with storage systems it would be possible to implement load-shifting algorithms to maximize cost savings.
  • Dynamic tariffs: in Demo Hive only static tariffs are taken into account for buying/selling energy. Clearly, this is not a realistic situation, and consequently a dynamic tariff system has to be implemented.
  • World conquering: …is coming 🙂

At Hive Power we are working hard on our demo testbed to continuously improve it and add more functionalities.

Key links:


Introducing the Hive Power Team


Gianluca Corbellini holds an M.Sc. in Mathematical Engineering from the Politecnico di Milano, focused on mathematical modelling, optimization and artificial intelligence. He has extensive experience in multinational corporations in the energy business, having been an asset manager for photovoltaic plants and a research engineer in the oil and gas industry. At the University of Applied Sciences and Arts of Southern Switzerland (SUPSI) he is involved in the modelling of photovoltaic plants and in the development of new business models for the optimization of smart grids. He was also a lecturer for the course “Design of Energy Systems”, regarding the design of micro-grids.

Davide Rivola is a senior researcher with a multi-disciplinary micro-engineering background. He leads the Energy Systems research sector at SUPSI. Before his research activities, he gained several years of industrial experience designing industrial automation systems and developing real-time software for embedded electronics. During the last seven years, he has researched, developed and trialed, in pilot projects, fully decentralized energy management systems for self-consumption optimization and grid instability reduction. He has been personally involved in blockchain technology since 2013, with an enthusiasm that has only grown over time.

Vasco Medici received an M.Sc. in Micro-Engineering from the Swiss Federal Institute of Technology in Lausanne and a Ph.D. in Neuroinformatics from the Swiss Federal Institute of Technology in Zurich. He previously worked on the development of real-time 3D video-based tracking applications. He currently leads the Intelligent Energy Systems Team at SUPSI, where he also teaches the “Introduction to Smart Grid” course. His main competences are system identification, algorithmics, modeling and simulation. He is the SUPSI coordinator for the Swiss Competence Center for Energy Research on Future Swiss Electrical Infrastructure (SCCER FURIES). In close collaboration with industrial partners, his team runs a number of pilot projects in the field of demand side management applied to smart grids.

Lorenzo Nespoli received an M.Sc. in Energy Engineering from the Politecnico di Milano in 2013. Since 2014 he has worked on multiphysics simulations and electric grid optimization at SUPSI, where he lectures in the “Introduction to Smart Grid” course. He is a Ph.D. candidate at the Swiss Federal Institute of Technology in Lausanne, where he works on decentralized control algorithms and model-based forecasts for demand side management in the distribution grid, in the context of the Swiss Competence Center for Energy Research on Future Swiss Electrical Infrastructure (SCCER FURIES).

Davide Strepparava is a researcher in the Intelligent Energy Systems Team at the Institute for Sustainability Applied to the Built Environment at SUPSI. He received an M.Sc. in Computer Science from the Politecnico di Milano. Before his academic activity, he worked for several years in the building automation and access control industries. He has notable experience in data science and database management. At SUPSI he is involved in research projects mainly related to the monitoring of solar plants and smart grids. Over the last two years he has gained substantial experience in blockchain technology, with a particular focus on the Ethereum platform, working on research projects related to decentralized smart energy markets.

Join Hive Power Telegram chat:

Learn more about Hive Power:

Installing Raiden on a Raspberry Pi 3

The integration of IoT and Ethereum is emerging as a powerful solution for data management using blockchain technology. Unfortunately, at present the speed and storage requirements of a typical IoT application exceed the capabilities of the public Ethereum blockchain, for two main reasons. The first is the block time, currently too high to keep up with IoT data. The second is the gas cost: every Ethereum transaction has a cost, so the total gas paid across all the transactions would be extremely expensive. Consequently, an interface between the “fast” world of IoT and the “slow but decentralized” one of Ethereum is currently needed.
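A back-of-the-envelope calculation makes the gas-cost argument concrete. The sampling interval below is an assumed example, not a figure from any specific deployment:

```shell
# An IoT meter posting one reading every 15 seconds (assumed interval)
# would need this many on-chain transactions per day:
INTERVAL_S=15
TX_PER_DAY=$(( 86400 / INTERVAL_S ))
echo "$TX_PER_DAY transactions/day"   # prints: 5760 transactions/day
```

Paying gas for thousands of transactions per day, per device, is clearly not sustainable, which is why an off-chain layer is needed.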

Raiden is a framework for the fast management of transactions. Built as an off-chain solution, it provides fast exchange of data among Raiden nodes using state channels, avoiding the long response times and the gas costs of on-chain transactions. On the other hand, the opening and closing of each Raiden state channel are tracked on the blockchain (currently only on the Ropsten network), together with the related initial and final balances.

This article explains the procedure to install Raiden on a Raspberry Pi 3. I chose this well-known hardware platform because it is widely used for IoT applications. It is assumed that Raspbian Jessie 8 is the operating system running on the Raspberry Pi 3.

Raspberry Pi 3 used for the Raiden installation

The installation can be summarized in the following steps. For each step an explanation is given together with the related bash commands.

  • Step 1: Installation of the libraries and tools required by the following steps.
# sudo apt-get install geth python-pip cmake libboost-all-dev
  • Step 2: Creation of a temporary swap file to avoid running out of memory. A Raspberry Pi 3 has 1 GB of RAM, roughly 20% of which is used by Raspbian and other processes. The remaining RAM is not sufficient for the compilations below, so swap space has to be used; in my tests without swap I always ran out of memory. In the example below I created a 0.5 GB swap file, enough for the compilations. Note that after the next two steps the swap file can be deleted and the space reclaimed.
# sudo dd if=/dev/zero of=/swap bs=1M count=512
# sudo mkswap /swap
# sudo swapon /swap
  • Step 3: Installation from source of the Z3 solver, required by the solc compiler. On the Raspberry Pi this process was very long, taking several hours.
# mkdir ~/software
# cd ~/software
# wget
# unzip
# cd z3-master
# python scripts/
# cd build
# make
# sudo make install
  • Step 4: Installation from source of the Solidity compiler (solc). This software is required by Raiden but, unfortunately, no binary package is currently available for the Raspberry Pi 3 hardware architecture (armv7). This is why compilation from source is needed and, ultimately, the main reason for this article. Computationally, this step also takes a significant amount of time, although less than Step 3.
# cd ~/software
# git clone --recursive
# cd solidity-0.x.y
# scripts/
# scripts/
  • Step 5: Installation of Raiden. The final step is much simpler than Steps 3 and 4: since Raiden is a set of Python scripts, no compilation is required.
# cd ~/software
# git clone
# cd raiden-x.y
# sudo pip install --upgrade -r requirements.txt
# sudo python develop
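As an aside, the dd invocation in Step 2 writes bs × count bytes, so with bs=1M the count simply equals the desired swap size in MiB. A small hypothetical helper for sizing the file differently:

```shell
# Compute the dd 'count' for a desired swap size (hypothetical helper):
SWAP_MB=1024                      # desired swap size in MiB (example value)
COUNT=$(( SWAP_MB / 1 ))          # with bs=1M, count equals the size in MiB
echo "sudo dd if=/dev/zero of=/swap bs=1M count=${COUNT}"
# prints: sudo dd if=/dev/zero of=/swap bs=1M count=1024
```

For the compilations in this guide, 512 MiB proved sufficient.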

Now all the tools are properly installed and we can start using them. The Raiden network is currently available only on Ropsten, so the first step is to start geth in light mode (the only mode I managed to run on a Raspberry Pi) and sync it with the test network.

# geth --testnet --light --v5disc --cache 1024 --rpc --rpcport 8545 --rpcaddr --rpccorsdomain "*" --rpcapi "eth,net,web3" console
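You can follow the sync progress with eth.syncing in the geth console; over JSON-RPC the same query is the eth_syncing method, which returns false once the node is synced. A live node is needed for the real call, so the snippet below only shows how to check a sample reply (the localhost endpoint is an assumption matching the --rpcport above):

```shell
# Real call (requires a running node):
#   curl -s -X POST -H 'Content-Type: application/json' \
#        --data '{"jsonrpc":"2.0","method":"eth_syncing","params":[],"id":1}' \
# Here we parse a sample reply instead:
REPLY='{"jsonrpc":"2.0","id":1,"result":false}'
if echo "$REPLY" | grep -q '"result":false'; then
  echo "node is synced"
fi
```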

Once the node is synced, a Ropsten account has to be created and funded with at least 0.1 ETH. After that we can start Raiden as shown below:

# raiden --keystore-path ~/.ethereum/testnet/keystore

To check whether Raiden has been installed successfully, you can query its REST interface with the following command and inspect the result:

# curl -X GET

If you get a similar result, you have Raiden working on your Raspberry Pi 3. Have fun!
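As a sanity check, a healthy node's address endpoint returns a JSON object containing an our_address field (the exact path and the address value below are assumptions for illustration; consult the Raiden REST API documentation for your version). A minimal check of a sample reply:

```shell
# Sample reply from the node's address endpoint (hypothetical values):
RESPONSE='{"our_address": "0x004b52c58863c903ab012537247b963c557929e8"}'
# Verify the expected key is present:
echo "$RESPONSE" | grep -q '"our_address"' && echo "Raiden REST API is up"
```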

At Hive Power we are using blockchain-enabled embedded devices to create energy sharing communities in which all participants are guaranteed to benefit from participating.
