We are a team of AI & Crypto researchers and engineers (from Google DeepMind, top-10 hedge funds, and pioneering AI/Crypto companies) with a long-term mission: to co-create the full decentralised AI stack in a trustless, permissionless way, with fair rewards for participation and contribution, so that it advances and benefits all of humanity. Our first focus is decentralising AI Data Availability and Data Restaking by connecting AI data publishers with data consumers in a decentralised network and marketplace.
Our team is small, flat in structure, highly motivated, and focused on engineering excellence. Eidon is a place for those who like to take ownership, bring strong curiosity and can-do energy, enjoy challenging themselves, and are ready to work at the intersection of multiple knowledge areas. All engineers and researchers share the title "Member of the Technical Guild."
All employees are expected to be hands-on and to contribute directly to the mission of advancing decentralised AI. Leadership is earned by those who show initiative and consistently deliver excellence. Work ethic and strong prioritisation skills matter here. We don't spend our time on PowerPoint decks or long hours in meeting rooms.
All engineers and researchers are expected to listen well and to communicate with clarity and respect, sharing knowledge openly, concisely, and accurately with teammates.
Eidon AI does not have recruiters, PMs or other “people/policy/patrolling” staff. Every application is reviewed directly by a technical member of the team.
We are funded by top-tier VCs from Silicon Valley.
Eidon AI is at the forefront of revolutionising the AI data ecosystem through decentralisation. We're seeking Data Engineers who are passionate about building scalable data pipelines and storage solutions in a decentralised environment. You will play a critical role in enabling seamless data flow between AI data publishers and consumers, ensuring the integrity and availability of data and fairness in rewards.
Design, build, and maintain efficient, reliable, and scalable data pipelines and storage solutions within a decentralised framework.