Several companies, such as Circuitmind, Cells, and JITX, are starting to use AI for hardware design.
I like this paper on generating voxel-based 3D objects:
Compared with polygon-based models, I think voxels model actual 3D objects more accurately.
They also seem closer to how 3D printing works.
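To make the contrast concrete, here is a minimal sketch (my own illustration, not from the paper) of what a voxel representation looks like: a 3D boolean occupancy grid approximating a sphere, much like the discrete layers a 3D printer deposits.

```python
import numpy as np

# Hypothetical example: represent a solid sphere as a voxel occupancy grid.
# A voxel is "on" (True) if its centre lies inside the sphere.
n = 32            # grid resolution: n x n x n voxels
radius = 0.8      # sphere radius in normalised [-1, 1] coordinates

# Coordinates of each voxel centre along one axis
coords = np.linspace(-1, 1, n)
x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")

# The voxel-based model of the sphere: a boolean 3D array
voxels = x**2 + y**2 + z**2 <= radius**2

print(voxels.shape)   # (32, 32, 32)
print(voxels.sum())   # number of occupied voxels
```

A polygon mesh would instead store only the surface; the voxel grid stores the full solid, which is why it maps so naturally onto layer-by-layer printing.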
Zero-knowledge (ZK) proofs are a promising way to ensure trust in all kinds of contractual and financial interactions, since they let one party prove a claim without revealing the underlying data.
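As a toy illustration (my own sketch, not tied to any production system), here is the Schnorr identification protocol, a classic zero-knowledge proof of knowledge: the prover convinces the verifier that she knows a secret x with y = g^x mod p, without ever revealing x. The parameters below are deliberately tiny.

```python
import secrets

# Toy Schnorr identification protocol -- illustration only, NOT secure.
p = 23            # prime modulus (p = 2q + 1, a safe prime)
q = 11            # prime order of the subgroup
g = 4             # generator of the order-q subgroup mod p

# Prover's secret witness and the matching public key
x = secrets.randbelow(q - 1) + 1     # secret, 1 <= x < q
y = pow(g, x, p)                     # public: y = g^x mod p

# 1. Commitment: prover picks random r, sends t = g^r mod p
r = secrets.randbelow(q)
t = pow(g, r, p)

# 2. Challenge: verifier sends a random challenge c
c = secrets.randbelow(q)

# 3. Response: prover sends s = r + c*x (mod q); s leaks nothing about x
s = (r + c * x) % q

# 4. Verification: g^s must equal t * y^c (mod p)
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted")
```

The check works because g^s = g^(r + cx) = g^r · (g^x)^c = t · y^c, yet the transcript (t, c, s) can be simulated without knowing x, which is what makes the proof zero-knowledge.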
Today, I want to cover some basics of cloud storage, so let me start with the most basic definition.
A web browser is old software, with lineages tracing back to the 1990s. It’s our portal to the cloud and our interface to Salesforce, AWS, Azure, and countless other services.
It’s also one of the most insecure apps you will ever use, and a frequent target of malicious attacks.
In our last post, we explored generalised and self-attention. Now let's move on and learn about more types of attention mechanisms.
(In case you missed part 1)
Attention mechanisms differ in where they find application, and in which parts of the input sequence the model focuses its attention on.
To understand attention mechanisms, let us first take a look at the concept of attention itself.
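Before going further, a minimal sketch may help ground the discussion: scaled dot-product self-attention, the mechanism we covered in part 1, written with NumPy. The names, shapes, and random inputs here are my own illustration, not from any particular library.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv      # project inputs to queries, keys, values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)       # similarity of every query to every key
    weights = softmax(scores, axis=-1)    # each row sums to 1: where the model attends
    return weights @ V                    # weighted mix of the values

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)   # (4, 8)
```

The "where the model focuses" distinction above lives in the `weights` matrix: each row is a probability distribution over the input positions, and the variants we look at next differ mainly in how that matrix is computed or restricted.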
While exploring the terms GPT-3 is built out of, we started by learning about generative models. Now let’s understand what pre-trained models are.