A lot of the dust has settled on PegNet’s specification since my last blog. Mining and burning FCT have launched successfully, and the implementation of transactions and conversions is underway, which means I can finally write this blog post about them. We’re also introducing a new way to grade miners, the motivation for which I’ll detail below.
Transactions & Conversions
Transactions and Conversions are very similar, and PegNet uses the same chain and data structure to record both. However, we have opted to keep the logic for the two separate: Transactions are operations where assets move from one address to a different address, while Conversions are operations where one asset is converted into a different asset … Read the rest
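As a rough sketch of the distinction above, here is how the two operation types could be told apart. The `Operation` type and its fields are purely illustrative, not PegNet’s actual on-chain data structure:

```go
package main

import "fmt"

// Operation is a hypothetical record of the kind a shared chain might
// store; the field names here are illustrative only.
type Operation struct {
	FromAddress string
	ToAddress   string
	FromAsset   string
	ToAsset     string
	Amount      uint64
}

// IsTransaction: the same asset moves between two different addresses.
func (op Operation) IsTransaction() bool {
	return op.FromAddress != op.ToAddress && op.FromAsset == op.ToAsset
}

// IsConversion: one asset becomes a different asset.
func (op Operation) IsConversion() bool {
	return op.FromAsset != op.ToAsset
}

func main() {
	tx := Operation{FromAddress: "FA1", ToAddress: "FA2", FromAsset: "pUSD", ToAsset: "pUSD", Amount: 100}
	conv := Operation{FromAddress: "FA1", ToAddress: "FA1", FromAsset: "pUSD", ToAsset: "pEUR", Amount: 100}
	fmt.Println(tx.IsTransaction(), tx.IsConversion())     // true false
	fmt.Println(conv.IsTransaction(), conv.IsConversion()) // false true
}
```

Both record types share the same shape; only the validation logic applied to them differs, which is why a single chain and data structure can hold both.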
I can’t believe it’s already been one year since I started working for Factomize. At the time, I had only a superficial knowledge of blockchains and no experience writing Golang. David’s charge that I become a Factom Protocol core developer seemed like an almost insurmountable task.
At first, I just worked on the Factomize forum, which was more in line with my area of expertise, while getting familiar with the Factom community and the node software. A couple of months later, it was time to learn Golang and familiarize myself with the core code. My first pull request, on January 7th, 2019, simply added myself to the factomd CLA. That was followed by my first feature implementation: adding HTTPS support to the … Read the rest
In my last blog, I described how the Proof of Work is calculated and how the mining process works, but now I want to dive a little deeper into the concept of proof of work. Most importantly, I want to answer the question: why does it work?
I’m only going to address Bitcoin’s Proof of Work system superficially, and unless explicitly stated otherwise, all the math and formulas apply to PegNet only.
While I am going to use the term “mining” throughout this blog, it is something of a misnomer, because when mining for ore or other substances there is incremental progress. When you dig a hundred-foot-long tunnel one day, the next day … Read the rest
PegNet is a decentralized, non-custodial network of tokens pegged (stabilized) to different currencies and assets that allows for trading and conversion of value without the need for counterparties. Built on the Factom Protocol, it is a fully auditable, open source stablecoin network that uses competitive Proof of Work and external oracles to converge on the prices of currencies and assets. The proposed initial currencies and other assets are:
Over the past couple of weeks, I’ve been involved in the PegNet project and I wanted to share my understanding of what it is and how it works with the rest of the world. I’m a developer and not an economist, so my perspective focuses more on the technical aspects than how to master the market. Due to the large scope of the project, this blog will be split into multiple pieces, with the first one focusing on the Oracle.
PegNet, short for Pegged Network, is a set of tokens pegged to existing currencies. It is built as a Factom Asset Token (“FAT”) standard on top of the Factom Protocol, meaning that the values and transactions sit inside … Read the rest
In my previous blog post on the gossip network, I detailed how the current network has a tendency to form a hub network and how that introduces both inefficiencies and scalability problems. A short recap: when booting up, all nodes connect to the seed nodes, leaving the seeds with a disproportionately large connection count. This impacts the fanout of messages, with the seed nodes receiving a disproportionate share of messages, the duplicates of which are dropped.
The ideal network structure is one in which every node in the system is connected to an equal number of other nodes. Achieving this is made difficult by the fact that nodes are not aware of the network topology.
Living in a world where it’s impossible to tell whether or not a recorded video is real sounds like a nightmare, but with the advent of Deepfakes, many feel that world has already arrived. The question of what to do about it is asked almost daily in the Factom Protocol community, but answers, both in our community and elsewhere, have been sparse.
Tackling Deepfakes is an extraordinarily difficult problem and, unfortunately, I have no easy answers. I do, however, have some expertise and a lot of interest in the area. The goal of this blog is to present the full scope of the problem, of which Deepfakes is only the latest iteration, and explore the … Read the rest
Up until now, I have been relying on legacy values for configuring the P2P 2.0 package I have been working on. These values are:
As far as I know, these values have been selected arbitrarily with the primary goal of ensuring that messages reach as many targets as possible. The drawback is that the more reliability you choose, the more the network will be flooded with duplicate messages. I wanted to find out if these settings make sense for the network and if it is possible to optimize them.
Since I am a programmer, not a mathematician, I opted to do this through an empirical process.
There is a lot of talk about scalability, sharding, and how to get factomd to the next level. What I want to talk about in this blog is skipping all the steps in between and starting right at the end: a fully customizable, modularized, shardable factom node.
This is not meant to be a proposal for what we should implement in factomd right now; it is an idealistic vision of the future that doesn’t account for hardware limits or optimizations. It is the haute couture of programming: not something meant to be implemented, but something to inspire goals and trends.
The foundation of extendable modularization is a unified message bus. All modules should be able to react … Read the rest
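The unified message bus idea can be sketched in a few lines. This is a conceptual toy of mine, not factomd’s actual architecture: modules subscribe to named topics and react to whatever any other module publishes there:

```go
package main

import (
	"fmt"
	"sync"
)

// Bus is a minimal in-process message bus: modules subscribe to named
// topics and receive everything published to them.
type Bus struct {
	mu   sync.RWMutex
	subs map[string][]chan string
}

func NewBus() *Bus {
	return &Bus{subs: make(map[string][]chan string)}
}

// Subscribe returns a channel on which a module receives every message
// published to the topic. The buffer keeps a slow module from blocking
// the publisher in this toy example.
func (b *Bus) Subscribe(topic string) <-chan string {
	ch := make(chan string, 16)
	b.mu.Lock()
	b.subs[topic] = append(b.subs[topic], ch)
	b.mu.Unlock()
	return ch
}

// Publish delivers msg to all current subscribers of topic.
func (b *Bus) Publish(topic, msg string) {
	b.mu.RLock()
	defer b.mu.RUnlock()
	for _, ch := range b.subs[topic] {
		ch <- msg
	}
}

func main() {
	bus := NewBus()
	blocks := bus.Subscribe("blocks")
	bus.Publish("blocks", "block validated")
	fmt.Println(<-blocks)
}
```

Because modules only ever talk to the bus, any one of them could in principle be swapped out, run in its own process, or sharded across machines without the others noticing, which is exactly the kind of modularity the vision above calls for.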