
The Subjective Charms of Objective-C

By News Room | April 20, 2025 | 6 Min Read

After inventing calculus, actuarial tables, and the mechanical calculator and coining the phrase “best of all possible worlds,” Gottfried Leibniz still felt his life’s work was incomplete. Since boyhood, the 17th-century polymath had dreamed of creating what he called a characteristica universalis—a language that perfectly represented all scientific truths and would render making new discoveries as easy as writing grammatically correct sentences. This “alphabet of human thought” would leave no room for falsehoods or ambiguity, and Leibniz would work on it until the end of his life.

A version of Leibniz’s dream lives on today in programming languages. They don’t represent the totality of the physical and philosophical universe, but instead, the next best thing—the ever-flipping ones and zeroes that make up a computer’s internal state (binary, another Leibniz invention). Computer scientists brave or crazy enough to build new languages chase their own characteristica universalis, a system that could allow developers to write code so expressive that it leaves no dark corners for bugs to hide and so self-evident that comments, documentation, and unit tests become unnecessary.

But expressiveness, of course, is as much about personal taste as it is information theory. For me, just as listening to Countdown to Ecstasy as a teenager cemented a lifelong affinity for Steely Dan, my taste in programming languages was shaped the most by the first one I learned on my own—Objective-C.

To argue that Objective-C resembles a metaphysically divine language, or even a good language, is like saying Shakespeare is best appreciated in pig Latin. Objective-C is, at best, polarizing. Ridiculed for its unrelenting verbosity and peculiar square brackets, it is used only for building Mac and iPhone apps and would have faded into obscurity in the early 1990s had it not been for an unlikely quirk of history. Nevertheless, in my time working as a software engineer in San Francisco in the early 2010s, I repeatedly found myself at dive bars in SoMa or in the comments of Hacker News defending its most cumbersome design choices.

Objective-C came to me when I needed it most. I was a rising college senior and had discovered an interest in computer science too late to major in it. As an adult old enough to drink, I watched teenagers run circles around me in entry-level software engineering classes. Smartphones were just starting to proliferate, but I realized my school didn’t offer any mobile development classes—I had found a niche. I learned Objective-C that summer from a cowboy-themed book series titled The Big Nerd Ranch. The first time I wrote code on a big screen and saw it light up pixels on the small screen in my hand, I fell hard for Objective-C. It made me feel the intoxicating power of unlimited self-expression and let me believe I could create whatever I might imagine. I had stumbled across a truly universal language and loved everything about it—until I didn’t.

Twist of Fate

Objective-C came up in the frenzied early days of the object-oriented programming era, and by all accounts, it should have never survived past it. By the 1980s, software projects had grown too large for one person, or even one team, to develop alone. To make collaboration easier, Xerox PARC computer scientist Alan Kay had created object-oriented programming—a paradigm that organized code into reusable “objects” that interact by sending each other “messages.” For instance, a programmer could build a Timer object that could receive messages like start, stop, and readTime. These objects could then be reused across different software programs. In the 1980s, excitement about object-oriented programming was so high that a new language was coming out every few months, and computer scientists argued that we were on the precipice of a “software industrial revolution.”
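To make that concrete, here is a minimal Objective-C sketch of such a Timer object, assuming nothing beyond Foundation. The start, stop, and readTime messages come from the example above; the internals are purely illustrative.

    // Timer.h -- the hypothetical Timer object described above
    #import <Foundation/Foundation.h>

    @interface Timer : NSObject
    - (void)start;
    - (void)stop;
    - (NSTimeInterval)readTime;   // seconds elapsed so far
    @end

    // Timer.m
    @implementation Timer {
        NSDate *_startDate;       // non-nil while the timer is running
        NSTimeInterval _elapsed;  // time accumulated across previous runs
    }

    - (void)start {
        if (!_startDate) {
            _startDate = [NSDate date];
        }
    }

    - (void)stop {
        if (_startDate) {
            _elapsed += -[_startDate timeIntervalSinceNow];
            _startDate = nil;
        }
    }

    - (NSTimeInterval)readTime {
        return _startDate ? _elapsed + -[_startDate timeIntervalSinceNow]
                          : _elapsed;
    }
    @end

Any other object could then reuse it simply by sending it messages, exactly as described: [timer start], and later [timer readTime].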

In 1983, Tom Love and Brad Cox, software engineers at International Telephone & Telegraph, combined object-oriented programming with the popular, readable syntax of the C programming language to create Objective-C. The pair started a short-lived company to license the language and sell libraries of objects, and before it went belly up they landed the client that would save their creation from falling into obscurity: NeXT, the computer firm Steve Jobs founded after his ouster from Apple. When Jobs triumphantly returned to Apple in 1997, he brought NeXT's operating system—and Objective-C—with him. For the next 17 years, Cox and Love's creation would power the products of the most influential technology company in the world.

I became acquainted with Objective-C a decade and a half later. I saw how objects and messages take on a sentence-like structure, punctuated by square brackets, like [self.timer increaseByNumberOfSeconds:60]. These were not curt, Hemingwayesque sentences, but long, floral, Proustian ones, syntactically complex and evoking vivid imagery with function names like scrollViewDidEndDragging:willDecelerate.
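For a taste of those sentences in code, here is a sketch under the same assumptions as the one above: FeedViewController and the increaseByNumberOfSeconds: message are hypothetical stand-ins from the paragraph you just read, while scrollViewDidEndDragging:willDecelerate: is a genuine UIScrollViewDelegate callback from UIKit.

    #import <UIKit/UIKit.h>
    #import "Timer.h"   // the hypothetical Timer sketched earlier

    // A hypothetical category declaring (and trivially implementing) the
    // increaseByNumberOfSeconds: message quoted in the text.
    @interface Timer (Adjusting)
    - (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds;
    @end

    @implementation Timer (Adjusting)
    - (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds {
        NSLog(@"Pretend we just added %.0f seconds.", seconds);
    }
    @end

    @interface FeedViewController : UIViewController <UIScrollViewDelegate>
    @property (nonatomic, strong) Timer *timer;
    @end

    @implementation FeedViewController

    // The bracketed, sentence-like message send quoted above.
    - (void)snooze {
        [self.timer increaseByNumberOfSeconds:60];
    }

    // A real UIScrollViewDelegate callback with the long, descriptive
    // name the paragraph cites.
    - (void)scrollViewDidEndDragging:(UIScrollView *)scrollView
                      willDecelerate:(BOOL)decelerate {
        if (!decelerate) {
            NSLog(@"Dragging ended and the content will stop immediately.");
        }
    }

    @end

Read aloud, the delegate method really is a full clause: the scroll view did end dragging, and it will (or will not) decelerate.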
