Fabric tools and utilities

Key Points

  1. Caliper is useful for performance analysis
  2. Explorer is useful for viewing transactions, writes, and blocks
  3. CouchDB Mango makes querying the Fabric world state easy (see the example query after this list)
  4. LevelDB can be used for the world state DB
  5. Other Fabric tools and utilities are listed here
  6. Carbon Accounting project
  7. Business Partner Agent - exchange master data between partners
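
A quick way to try a Mango query is to POST to the peer's CouchDB state database using the _find endpoint. This is a minimal sketch only: the host, port, admin/adminpw credentials, the channel and chaincode names (Fabric names the state database <channel>_<chaincode>), and the docType/owner fields are all assumptions to adapt to your own network.

curl -s -X POST http://admin:adminpw@localhost:5984/mychannel_mycc/_find \
  -H "Content-Type: application/json" \
  -d '{"selector": {"docType": "asset", "owner": "tom"}, "limit": 10}'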


References

Reference description with linked URLs                                Notes

Hyperledger Fabric Performance                                        Hyperledger Deployment

https://github.com/APGGroeiFabriek/PIVT
https://drive.google.com/open?id=1V9GCQBroPWFMz0NN24QS9VFZ4KCit6MM    Kubernetes toolsets for a Fabric network

https://labs.hyperledger.org/labs/convector-framework.html           Convector tool suite - generate web apps from Fabric smart contracts











Key Concepts


Export Fabric Ledger to a Database tool

https://github.com/kfsoftware/hlf-sync

Hyperledger Fabric stores information in blocks, but that data is not structured and lacks the search and processing capabilities of modern databases.

This project stores all of the ledger data in an off-chain database so the blockchain data can be accessed directly and used for other purposes, such as validation, dashboards, and statistics.

Prerequisites:

  • A running Hyperledger Fabric network
  • A running supported database

You can download the binary from the releases page.

hlf-sync --network=./hlf.yaml --config=config.yaml --channel=mychannelname


Configurations for different databases

The configuration file for a PostgreSQL backend

database:
  type: sql
  driver: postgres
  dataSource: host=localhost port=5432 user=postgres password=postgres dbname=hlf sslmode=disable

The configuration file for a MySQL backend

database:
  type: sql
  driver: mysql
  dataSource: root:my-secret-pw@tcp(127.0.0.1:3306)/hlf?charset=utf8mb4&parseTime=True&loc=Local

The configuration file for an Elasticsearch backend

database:
  type: elasticsearch
  urls:
    - http://localhost:9200
  user:
  password:
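
Once hlf-sync is running, you can sanity-check that data is flowing into the backend. The commands below are a hedged sketch using the example connection values above; the table and index names that hlf-sync creates are not shown here, so list them first.

# PostgreSQL: list the tables hlf-sync has created in the hlf database
psql "host=localhost port=5432 user=postgres password=postgres dbname=hlf sslmode=disable" -c '\dt'

# Elasticsearch: list the indices hlf-sync has created
curl -s http://localhost:9200/_cat/indices?v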


CA key management



Public / Private key management

Private keys are never sent anywhere.  Only public keys are included with transactions.
If you are using the fabric-ca-client or any of the SDKs, private keys are by default created on the local file system of the host on which you enroll.  You can also choose to use the PKCS11 provider to have the private key generated and stored in an HSM.
If you do generate the key on the local file system, you should set its permissions to 0400 on *nix based OSes.  You should also encrypt the file system (especially when running in a public cloud).
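
For example, a minimal sketch of locking down an enrolled identity's key, assuming the default fabric-ca-client layout where keys land under the MSP keystore directory (adjust the path for your SDK or MSP location):

# restrict every private key file in the keystore to owner read-only
find "$FABRIC_CA_CLIENT_HOME/msp/keystore" -type f -name "*_sk" -exec chmod 0400 {} \;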

Gari Singh


Kubernetes toolsets for a Fabric network

https://github.com/APGGroeiFabriek/PIVT









Kafka and Zookeeper nodes need to be persisted

https://stackoverflow.com/questions/50287088/when-using-hyperledger-fabric-with-kafka-consensus-is-persistent-storage-requir/50289394#50289394

You do need to persist the storage for the Kafka and Zookeeper nodes.

For Kafka, you can set the KAFKA_LOG_DIRS environment variable and then make sure you attach an external volume to that directory.

For Zookeeper, the default data directory is /data, so attach an external volume to that directory.
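
A minimal sketch of what that looks like with plain docker run commands; the container names and host paths are illustrative, and most deployments would express the same thing as docker-compose or Kubernetes volumes instead.

# Kafka: point log.dirs at a directory backed by an external volume
# (other required Kafka settings such as KAFKA_ZOOKEEPER_CONNECT omitted for brevity)
docker run -d --name kafka0 \
  -e KAFKA_LOG_DIRS=/var/kafka-logs \
  -v /mnt/kafka0:/var/kafka-logs \
  hyperledger/fabric-kafka

# Zookeeper: the image's default data directory is /data
docker run -d --name zookeeper0 \
  -v /mnt/zookeeper0:/data \
  hyperledger/fabric-zookeeper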




Chainyard Fabric solution generator from metadata model


https://www.linkedin.com/posts/movee97_hyperledger-fabric-project-accelerator-activity-6780446612964569088-JUuA



Fabex - Fabric Block Explorer Tool - active

https://github.com/hyperledger-labs/fabex

Tutorial on Fabex 

https://vadiminshakov.medium.com/fabex-tutorial-an-introduction-to-the-right-hyperledger-fabric-explorer-cd9ee1848cd9




Business Partner Agent framework for secure master data exchange - active

https://github.com/hyperledger-labs/business-partner-agent

The Business Partner Agent allows organizations to manage and exchange master data with their business partners. Exchange of master data should not happen via telephone, Excel, e-mail, or various supplier portals. Organizations should be able to publish documents like addresses, locations, contacts, bank accounts, and certifications publicly, or exchange them privately with their business partners in a machine-readable and tamper-proof format. Furthermore, verified documents issued by trusted institutions can streamline the process of onboarding new business partners.

The Business Partner Agent is built on top of the Hyperledger self-sovereign identity stack, in particular Hyperledger Indy and Hyperledger Aries Cloud Agent Python.

Current Features

  • Attach a public organizational profile to your public DID (either did:indy/sov or did:web)
  • Add business partners by their public DID and view their public profile.
  • Add documents based on Indy schemas and request verifications from business partners
  • Share and request verified documents with/from your business partners


Blockchain Carbon Accounting in Fabric - active

https://github.com/hyperledger-labs/blockchain-carbon-accounting


Carbon accounting working group

https://wiki.hyperledger.org/display/CASIG/Carbon+Accounting+and+Certification+Working+Group

The mission of this working group is to

  • identify how blockchain or distributed ledger technologies (DLTs) could improve corporate or personal carbon accounting
  • make carbon accounting and certifications more open, transparent, and credible
  • build collaboration between consumers, businesses, investors, and offset developers across industries and national boundaries.

We're here to help 

  • Businesses and organizations take action on climate change by making the process easier and less costly.
  • Certifying entities scale by streamlining the verification of corporate climate action.  
  • General public and consumers trust corporate climate action.
  • Lenders and investors align their capital decisions with climate goals.
  • Offset buyers and developers connect with each other with greater trust and transparency.

Carbon Offset Guide

https://www.offsetguide.org/understanding-carbon-offsets/what-is-a-carbon-offset/

A carbon offset credit is a transferable instrument certified by governments or independent certification bodies to represent an emission reduction of one metric tonne of CO2, or an equivalent amount of other GHGs. The purchaser of an offset credit can “retire” it to claim the underlying reduction towards their own GHG reduction goals.

emissions channel solution 

https://github.com/hyperledger-labs/blockchain-carbon-accounting/blob/main/utility-emissions-channel/README.md

This project implements the Utility Emissions Channel Hyperledger Fabric network in a docker-compose setup and provides a REST API to interact with the blockchain. A demo video is linked from the project README.

To calculate emissions, we need data on the emissions from electricity usage. We're currently using the U.S. Environmental Protection Agency eGRID data, U.S. Energy Information Administration's Utility Identifiers, and European Environment Agency's Renewable Energy Share and CO2 Emissions Intensity. The Node.js script egrid-data-loader.js in utility-emissions-channel/docker-compose-setup/ imports this data into the Fabric network.

net emissions token solution

https://github.com/hyperledger-labs/blockchain-carbon-accounting/blob/main/net-emissions-token-network/README.md

The (net) emissions tokens network is a blockchain network for recording and trading the emissions from different channels, such as the utility emissions channel, plus offsetting Renewable Energy Certificates and carbon offsets. Each token represents either an emissions debt, which you incur through activities that emit greenhouse gases, or an emissions credit, which offsets the debt by removing emissions from the atmosphere.

Read more on the Hyperledger Emissions Tokens Network Wiki page. A demo video, further documentation, and setup instructions are linked from the project README.



Convector framework - generate web apps (Node.js) from Fabric smart contracts - DOA, but a good idea

https://labs.hyperledger.org/labs/convector-framework.html

Convector is a Model/Controller full-stack JavaScript framework to improve and speed up the development of clean, scalable, and robust smart contract systems. The developer focuses on the EDApps (Enterprise Decentralized Applications) and the contractual relationships of participants rather than on lower-level blockchain details.

It currently supports Hyperledger Fabric and provides tools to build fullstack TypeScript Smart Contract Systems made up of native JavaScript chaincodes, backend layers (Node.JS), and front end modules (such as AngularJS and React).

Rather than creating new models for chaincode development, it improves the existing development lifecycle on top of Fabric's models, Node.js backends, and front-end libraries and frameworks by abstracting logic into Models and Controllers, as well as providing useful tools for the developer such as local development blockchain network creation and testing frameworks. The framework also comes with pre-built storage and adapter layers to support the basic flow of communication from front end to back end to blockchain, as well as CouchDB querying.

Its modular approach aims to make Convector a cross-blockchain framework, making it possible to plug in third-party and custom data layers (blockchain, HTTP libraries, etc.) and adapters (Fabric's SDK, CouchDB drivers).

https://github.com/worldsibu/convector


Relationship to Feathers.js concepts




Potential Value Opportunities



Potential Challenges






Candidate Solutions



Debug Fabric instances in Docker Compose

https://medium.com/@rsripathi781/docker-cheat-sheet-for-hyperledger-fabric-128e89f2f36b

Docker Cheat Sheet for Hyperledger Fabric.pdf


1. List all containers - Check the status and container IDs of running and stopped containers.

docker ps -a

2. Check logs of containers - You may need peer or orderer logs when you invoke or query chaincodes, join peers to channels, etc.

docker logs containerid

3. Get into a docker container - You may need to go into a container to explore volumes you mounted during container creation, for example to get hold of blocks being routed by the orderer or to explore the ledger stored on a peer.

docker exec -it containerid bash

4. Get into the Fabric CLI - If you defined a CLI container in docker-compose, this command takes you into it so you can run commands against the specified peer.

docker exec -it cli bash

5. Restart Container

docker restart containerid

6. Run all services defined in docker-compose

docker-compose -f yourdockercompose.yaml up -d

7. Run specific service defined in docker-compose

docker-compose -f yourdockercompose.yaml up -d servicename

8. Tear down container volumes

docker-compose -f yourdockercompose.yaml down --volumes

9. Force remove all containers (running and stopped)

docker rm -f $(docker ps -aq)

10. Remove all images, including those that were used by containers

docker rmi -f $(docker images -q)

11. Remove images matching a name pattern

docker rmi -f $(docker images "dev-*" -q)

"dev-*" matches all images whose name starts with dev (for example, the chaincode images built by the peer).
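
Putting a few of these together, here is a hedged sketch of tracking down a failing chaincode. The container and path names are illustrative (Fabric builds chaincode containers with a dev- prefix, and the peer's ledger lives under /var/hyperledger/production by default), so substitute the names from your own docker ps -a output.

docker ps -a | grep dev-                                                            # find the chaincode container
docker logs dev-peer0.org1.example.com-mycc-1.0 2>&1 | tail -n 50                   # read its last log lines
docker exec -it peer0.org1.example.com ls /var/hyperledger/production/ledgersData   # peek at the peer's ledger files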



Step-by-step guide for Example






Recommended Next Steps