Computer Science

Mary Lee Woods: The Programmer Who Helped Teach Machines to Think

Mary Lee Woods was one of Britain’s earliest computer programmers, helping develop software for the Ferranti Mark 1 and shaping the foundations of modern coding. This article explores her pioneering work, her role in early computing, and how her influence helped foster the environment that led to Tim Berners-Lee’s invention of the World Wide Web.

Conway Berners-Lee: The Engineer Who Wired the World Before the Web

Conway Berners-Lee was a pioneer of early British computing, working on the Ferranti Mark 1, developing foundational programming standards, and shaping the logic and architecture that made modern computing possible. This article explores his life, his engineering legacy, and how his work quietly paved the way for his son Tim Berners-Lee’s invention of the World Wide Web.

The Three Laws of Robotics

Isaac Asimov’s Three Laws of Robotics began as a literary device but grew into one of the most influential ideas in science fiction and AI ethics. This essay explores their origins, how they shaped his stories, their relevance to real-world robotics, and the deeper philosophical questions they raise about safety, freedom, and the rights of future intelligent beings.

Leibniz for the 21st Century: Philosophy, Computation and the Human Machine Future

In the 17th and 18th centuries, Gottfried Wilhelm Leibniz pioneered a bold vision of reality as built from ‘monads’ — indivisible, dynamic units whose internal states reflect the entire universe. With his binary arithmetic and his proposals for a formal language of logic, he anticipated key ideas in modern computing, artificial intelligence and systems theory. His philosophical principles — such as the identity of indiscernibles and the principle of sufficient reason — continue to shape debates about machine reasoning, human values and the future of interconnected technologies.

Is the Turing Test Still Relevant?

When Alan Turing proposed his now-famous test in 1950, it was a daring thought experiment: if a human could converse with a machine and not tell the difference, the machine could be said to “think.” For decades, the Turing Test was a beacon — a clear, almost cinematic benchmark for artificial intelligence. But in 2025, …

Charles Babbage: The Visionary “Father of the Computer”

Charles Babbage (1791–1871) was a 19th-century English mathematician, engineer, and inventor often hailed as the “father of the computer.” He originated the concept of a programmable, digital computing machine long before electronic computers existed[1]. A true polymath, Babbage designed mechanical calculating engines – most famously the Difference Engine and the Analytical Engine – that are …

Ada Lovelace: Prophet of the Thinking Machine

Ada Lovelace is often called the world’s first computer programmer — but she was so much more than that. In an era when women were excluded from scientific circles, she fused mathematical logic with poetic imagination to foresee a future where machines could create, not just calculate. Her vision laid the philosophical groundwork for modern computing and, ultimately, artificial intelligence. This article explores her extraordinary legacy and why her foresight still resonates in our age of rapid technological advancement.