Introducing HerePix: My First iPhone App

herepix icon

I did it! I made an iPhone app and officially joined the ranks of millions of developers with an app on the Apple App Store. HerePix: Photos from where you are.

For the past few years, especially when traveling, I kept asking myself, “What photos have I taken here?” Sure, you can dig through your phone to find them, but why isn’t there an app that instantly shows you all the pictures you’ve taken at a specific location? Turns out, some apps do exist, but they were bloated with unnecessary features. I wanted something simpler, more intuitive.

Though I’ve always been comfortable with tech, I never really saw myself as an app developer. I’ve spent more time with scripts and found object-oriented programming challenging—especially back in college when I struggled through Java classes. I had tried making simple apps with tutorials or Swift Playgrounds, but those felt like exercises without a real goal in mind.

That’s when I decided to try a different route. I knew I could figure out app design and tweak code if someone could just give me a solid starting point. After seeing how outsourcing app development worked at my day job, I thought, why not give it a shot for my personal project? I turned to Upwork, a marketplace where freelancers bid on jobs, and found a developer overseas who agreed to build the basic foundation of the app. Within a week, he had provided the skeleton code, and I had an app that did the basics of what I envisioned.

That’s when the real fun began. I started refining the app and working on the branding—enter HerePix, a name I’d been holding onto for years for this exact idea. I dove into learning Swift and UIKit, building out layouts, focusing on the visual elements, and taking the app beyond the initial framework. Today, only about 5% of the code in the released version of HerePix came from the freelancer; the rest is the result of my time, effort, and plenty of trial and error.
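HerePix itself is written in Swift on top of Apple’s photo frameworks, but the core query, finding photos whose geotag falls near your current location, is easy to sketch in any language. Here’s a minimal, illustrative Python version using the haversine great-circle distance; the names and data shapes are my own, not the app’s actual code:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def photos_near(photos, lat, lon, radius_km=1.0):
    """Return photos whose geotag falls within radius_km of (lat, lon)."""
    return [p for p in photos
            if p.get("lat") is not None
            and haversine_km(p["lat"], p["lon"], lat, lon) <= radius_km]

# Toy library: one photo taken in Las Vegas, one in New York.
library = [
    {"name": "strip.jpg", "lat": 36.1147, "lon": -115.1728},
    {"name": "times_square.jpg", "lat": 40.7580, "lon": -73.9855},
]

# "Where am I?" -> downtown Las Vegas; only the nearby photo matches.
nearby = photos_near(library, 36.1699, -115.1398, radius_km=10)
```

On iOS the real work is done by the system photo library, which can filter assets by their location metadata, but the distance math above is the same idea.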

I started testing HerePix with the Xcode simulator, then on my own iPhone. Whenever I’d go to a new place, I’d open the app and verify that all my photos from that location were displayed. I’m happy to say that the app is now everything I wanted it to be—simple, intuitive, and exactly what I’d envisioned.

Of course, this isn’t going to make me a fortune. In fact, it’s a free app that will likely go unnoticed by most people, but that’s not why I made it. This was a passion project, something I did for the joy of creating. And who knows, maybe it will lead to future apps down the road.

So, if you’re curious, give HerePix a try. I’d love to hear what you think, and I had a blast making it!

As of this post, HerePix is available worldwide on the Apple App Store for iPhone running iOS 18.0 or later.

Ray-Ban-dwith: Smart Sunglasses That Might Make You Dumber

ray-ban-meta

Beard Blog Review – Meta Ray-Ban

Ray-Ban Meta Smart Glasses

I’ve always been loyal to my Ray-Ban Wayfarers, even though recently I discovered the Oakley Holbrook and their impressive lenses. Over the past year, I’ve heard bits and pieces about the Ray-Ban Meta glasses but dismissed them due to concerns about Meta’s privacy track record. The idea of a wearable camera from them felt a little… unsettling. Typically, before trying out a new product, I dive into reviews, watch videos from creators, and gather feedback from peers to get a well-rounded perspective. After some deliberation, I finally decided to explore the world of “smart” glasses myself. I opted for a pair of Ray-Ban Meta Wayfarer Polarized in black.

At first glance, they resemble slightly smaller versions of the Ray-Ban Wayfarers I’ve always loved, complete with the familiar rigid brown leather storage case. The fit is a bit different, mainly because of the thicker arms that house all the smart features. After wearing them for a good portion of the day, they start to become uncomfortable. Hopefully I’ll get used to this in time, but more than four hours of continuous wear isn’t working right now.

The charging case is one of my favorite features. If you’re familiar with how AirPods work, you’ll find these glasses quite similar. The case has its own internal battery that you charge, and it in turn charges the glasses whenever they’re stored inside, so your glasses are likely at 100% whenever you take them out to wear. There’s no need for a separate charger or a bulky dock—just place them in the case, and charge the case itself about once a week using USB-C. Where the button snap sits on a standard Ray-Ban case, there’s an LED indicator that shows the charging status, which is a handy addition. Another perk is that the case looks just like a regular Ray-Ban sunglasses case at first glance, making it less likely to attract unwanted attention or theft.

Glasshole Photography

The standout feature of the Meta glasses, aside from the obvious sun-blocking capability, is the integrated camera, which lets you capture photos and videos of whatever’s in your line of sight. The camera is an ultra-wide-angle 12MP sensor, which gives images a slight fisheye effect. There are no zoom or telephoto options, so you can only capture what’s directly in front of you. A thin button on the top of the right arm captures media—a single click takes a photo, while a long press starts recording a video. Videos top out at 30 frames per second, but you’re not shooting for quality here. Without the LED privacy indicator next to the camera, it would simply look like you’re adjusting your glasses when taking a photo. To its credit, Meta has made capture obvious: the light flashes when a photo is taken, remains solid while video is recording, and cannot be disabled, ensuring others are aware the camera is in use.

When you take a photo or video, it’s stored on the glasses’ internal 32GB storage until you import it to your phone, which happens in one of two ways. If the glasses are in their case and connected to a known Wi-Fi network, the media transfers to your photo library automatically, provided the Meta View app is running. If you’re wearing the glasses, you’ll need to open the Meta View app and connect to the glasses’ built-in Wi-Fi network for the media to import. That process can interfere with your phone’s Wi-Fi connection, such as when using wireless CarPlay, since the glasses require the phone to disconnect from the car to complete the transfer.

As for the quality of the photos and videos, they are decent considering the source, but don’t match the capabilities of current smartphone cameras. The glasses don’t support HDR or Live Photos, but they do include geotagging if they’re connected to your phone. Media files are saved in a high-efficiency format, allowing you to store over 500 photos or more than 100 videos of 30 seconds each. Interestingly, photos are taken in landscape orientation, while videos are in portrait. This choice was likely made by Meta to optimize sharing on their social media platforms.

The sunglasses also feature built-in speakers that function as earphones for taking calls, listening to music, and interacting with AI. They’re convenient because you can listen to music without needing anything in your ears. At moderate volume, people around you are unlikely to overhear your audio, though at higher volumes bystanders can catch some sound. In noisy environments, even maximum volume can be hard to hear, which is where noise-cancelling earphones still win. The glasses connect to your smartphone like any Bluetooth headphones, so you can use them for any audio from your phone. The Ray-Bans have five built-in microphones, so you’re heard clearly on calls; in my experience, I had no issues being heard.

What Am I Looking At?

The last feature, and likely what Meta considers the standout one, is the integration of Meta’s AI assistant directly into the glasses, called ‘Meta AI.’ Similar to Siri or Google Assistant, you can ask Meta AI questions, and it will provide audible answers through the speakers. One impressive capability of Meta AI is its ability to describe what you’re looking at. It does this by taking a photo and analyzing it using an AI model. This can help answer questions like “What type of flower is this?”, “Translate this sign to English.”, or simply “What am I looking at?” It performs well, but its effectiveness is somewhat limited by the wide field of view and lack of zoom on the camera, requiring you to get close to objects for accurate identification.

I really enjoy having an AI assistant on my head, eliminating the need to use my phone. Recently, I visited a zoo and could ask it questions about the animals, receiving instant responses through the speakers. You can even continue a conversation with the assistant without saying the wake word (“Hey Meta”) for several replies in a row, enhancing the conversational experience. I did encounter a few hiccups, such as Meta AI occasionally losing its internet connection, requiring me to restart the app to continue. However, some moments made me smile. For instance, when I tried to take a photo, Meta AI immediately said, “I’m unable to take a photo, there is a hat in the way,” and sure enough, my hat was blocking the camera. Another time, in a noisy environment, when I asked a question, Meta AI responded, “It’s too noisy, and I’m unable to hear you,” which I found accurate, as I could barely hear its response. I’d prefer knowledgeable replies like these over generic ones like, “Sorry, I can’t do that…”.

The Good

  • Premium hardware with a subtle design.
  • Functional as regular sunglasses even when the battery is depleted.
  • Innovative smart case and charging design.
  • Meta AI offers practical utility.

Missed Opportunities

  • Limited color and size options.
  • Speakers could be louder or better tuned.
  • Limited iPhone integration, largely due to Apple’s constraints.

The Bad

  • Camera quality falls short of basic smartphones.
  • Photo syncing is cumbersome while wearing the glasses.
  • Touch controls are not always reliable.
  • Less comfortable than traditional Wayfarers.

9 out of 10

How can the Ray-Ban Meta glasses get a 10/10?

  • Improve camera quality and image signal processing to take more feature-rich media.
  • Better integration with smartphone platforms.
  • Better controls and more physical buttons.
  • More brands, colors, and lens options for the smart glasses.

Should you buy Ray-Ban Meta glasses?

Meta’s smart sunglasses offer a stylish blend of premium hardware and innovative features, including the practical Meta AI assistant that provides real-time information and assistance. They function effectively as regular sunglasses even when the battery is low, and the smart case makes charging convenient. The camera quality isn’t quite on par with basic smartphones, and the touch controls and photo syncing could be improved, but the overall design and functionality make these sunglasses a great choice for tech enthusiasts and forward-thinkers who appreciate cutting-edge technology in a sleek, understated package.

iPhone Photography By The Numbers

iPhone photography

As a semi-professional photographer, I use multiple tools for my hobby. With an iPhone, I always have a great camera in my pocket. Since 2007 I’ve been taking photos with my iPhone and upgrading that camera as fast as Apple introduced new features. Because of the ever-changing smartphone camera market, I started to track what type of photos I was taking with my primary camera. Now that flagship smartphones carry at least four cameras, I wanted to know which lenses I was actually using and which I didn’t really care for.

I started tracking these numbers in 2018 when I moved from the iPhone X to the iPhone XS. The XS had a much better camera than the X, which had been more of an industrial-design change than a photography upgrade. Each year since, I’ve counted up my photos from the past year and noted which lens was used. Now I can look back and see which phone I took the most photos with and which lens was the most popular each year.

Over the years my overall iPhone camera usage has gone way down since the iPhone X. I’m not sure why, other than the pandemic. The majority of my photos come from the Main/Wide/1x camera, but its share has shrunk as the cameras have multiplied and offered different perspectives. The first zoom, or telephoto, lens came to the iPhone in 2016 with the iPhone 7 Plus, and since then it’s been a unique addition that lets you zoom in on subjects without digitally cropping.

With the addition of the Ultra-wide camera on the iPhone 11 Pro, a third rear lens was available to split my photography between. It gained macro capability on the iPhone 13 Pro, which explains the large bump in its share of my photos this past year. In contrast, the lack of progress on the front-facing camera is reflected in my reduced use of it. If you’re the type of person who takes a lot of selfies, though, that camera will get plenty of work.

Model               Main          Telephoto    Ultra-wide   Front
iPhone X            2702 / 80%    467 / 14%    -            198 / 6%
iPhone XS           2183 / 74%    646 / 22%    -            128 / 4%
iPhone 11 Pro Max   1458 / 64%    369 / 16%    318 / 14%    141 / 6%
iPhone 12 Pro Max   1094 / 67%    218 / 13%    220 / 14%    104 / 6%
iPhone 13 Pro Max   900 / 52%     291 / 17%    444 / 26%    88 / 6%
Totals              8337 / 70%    1991 / 17%   982 / 8%     659 / 5%

Every year Apple has a story to tell about how the camera is better or different on the new phones. With every upgrade I say I’m going to make a better effort to take more photos, but the numbers don’t lie. Over the last 12 months I snapped only 1,723 photos, the fewest iPhone photos in any year I have on record. For comparison, I saved about 300 photos from my professional camera, a Sony a7III, so overall it was a down year for photos. I plan to improve on that over the next 12 months.
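The shares in the table are simple rounded ratios of each lens’s count to that year’s total. As a sanity check, here’s the iPhone 13 Pro Max row computed in Python (an illustrative script, not how I actually tallied the numbers):

```python
# Per-lens photo counts from the iPhone 13 Pro Max row of the table above.
counts = {"Main": 900, "Telephoto": 291, "Ultra-wide": 444, "Front": 88}

# Total photos taken with this phone over the year.
total = sum(counts.values())  # 1723

# Each lens's share of the total, as a rounded whole percentage.
shares = {lens: round(100 * n / total) for lens, n in counts.items()}
```

Running this reproduces the Main (52%), Telephoto (17%), and Ultra-wide (26%) shares from the table; the Front row comes out to 5% rather than the table’s 6%, so that entry was evidently rounded differently.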

Check out my iPhone 14 Pro Max review!

Scanning Photographs in 2022

photograph collage

The Inspiration

On the latest episode of one of my favorite podcasts, Reconcilable Differences, John Siracusa explained his latest project: scanning old photographs. He recently acquired a new multi-function printer and, while testing the quality of its scanner, stumbled into an in-depth project. That got me thinking, “this sounds like something I might want to do.” John goes on to detail his process and all the drawbacks, up to the point of questioning why he is even undertaking such a large task.

What’s My Purpose

As a onetime professional and longtime hobbyist photographer, I’ve been taking digital photos since 2002 and have amassed an iCloud Photo Library in excess of 50,000 images. That said, I do possess some non-digital photographs that I’d like to preserve longer than I trust myself to care for printed originals. For a graduation present, my mother made me a scrapbook of my life thus far in photographs. It’s a priceless keepsake that unfortunately uses original photographs. For my 30th birthday she flexed her creative muscles again, making a photo board of more pictures from my first 30 years. She used about 30 original photos on that board, and I’ve kept it around ever since because I wanted to keep the photographs it contained. The board was the perfect starting point to test-drive a scanning project similar to John’s.

Read more

Low-Light Shootout

Apple iPhone Pro Max vs. Sony a7III

Low-light photography is all the rage now on smartphones. Better sensors and lenses combined with artificial intelligence, machine learning, and other 21st-century buzzwords allow smartphone cameras to actually see in the dark. But how good are they?

Low-light photography has been around for a long time, especially on professional-grade cameras that can hold the shutter open for extended periods of time allowing the sensor to gather as much light as possible.
With the overhauled and widely acclaimed camera system on the iPhone 13 Pro/Pro Max, I thought it would be a good time to see how it compares to a “real” camera.
Using the same tripod and lighting conditions, I captured my backyard with both an iPhone 13 Pro Max and a Sony a7III mirrorless camera.

Let’s start with the iPhone 13 Pro Max, 10-second “Night Mode” capture:

ISO 5000 26mm f/1.5 2.0″ HEIC

You can see there’s not a lot of light here. Ten seconds is a long time to shoot handheld, but on a tripod it’s not nearly long enough.

iPhone 13 Pro Max, 30-second “Night Mode” capture:

ISO 6400 26mm f/1.5 10″ RAW

A lot more light here; it almost looks like daylight outside. The image is visible and you can see details, but there’s a lot of noise present, and if you zoom in you can see some blurring from noise reduction.

Sony a7III 10-second exposure at f/1.8

ISO 5000 50mm f/1.8 10″ ARW

Much clearer detail here, with that daylight look of the 30-second iPhone shot. The high ISO gives it a softer look with some visible noise.

Sony a7III 30-second exposure

ISO 1250 50mm f/1.8 30″ ARW

The ultimate night shot: at 30 seconds the sensor has enough time to gather light, so it can drop the ISO and sharpen all the details.
Some might say, “This isn’t a fair fight!”, but if Apple is aiming to take on the prosumer camera market, it needs benchmarks. This three-year-old camera shows just how far smartphone cameras still have to go to be competitive.
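A rough way to compare the two Sony exposures: at a fixed aperture (both shots were f/1.8), rendered image brightness scales roughly with ISO × shutter time, while the actual light collected scales with shutter time alone. This quick Python sanity check uses that simplified model (my own illustration, ignoring real-world sensor behavior):

```python
import math

def brightness(iso, seconds):
    """Simplified model: rendered brightness ~ ISO * shutter time (fixed aperture)."""
    return iso * seconds

ten_sec = brightness(5000, 10)     # 10 s shot at ISO 5000
thirty_sec = brightness(1250, 30)  # 30 s shot at ISO 1250

# The 10 s shot renders slightly brighter (in stops)...
stops_brighter = math.log2(ten_sec / thirty_sec)

# ...but the 30 s shot collects 3x the actual light, allowing 4x lower
# ISO (gain), which is why it shows far less noise.
light_ratio = 30 / 10    # photons collected: 3x
iso_ratio = 5000 / 1250  # gain reduction: 4x
```

In other words, tripling the exposure time let the camera cut its gain by a factor of four while giving up less than half a stop of brightness, and lower gain means a cleaner image.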

If you’re looking to get great photos in a dimly lit room, an iPhone with Night Mode will do the job.
If you want to create daylight out of near-pitch-black, get a tripod and a big camera that can suck in all the light available.

Check out my iPhone 13 Pro Max review here (coming soon)

Macro Shootout

Apple iPhone 13 Pro Max vs. Sony a7III

Macro photography captures larger-than-life images, revealing detail on small objects that you normally don’t get to appreciate with the naked eye.

Apple’s most recent flagship phones, the iPhone 13 Pro and Pro Max, have a new macro capability that allows the ultra-wide lens to focus on objects as close as two centimeters. Macro photography is always fun, especially with insects, flowers, and everyday objects, and never has it been so accessible to consumers with a device they already carry in their pockets.

I’ve decided to see how well the iPhone 13 Pro Max does against a “big” camera with a cheap macro lens. 

I’m using a 7artisans 60mm f/2.8 lens that I purchased on Amazon, mounted on a Sony a7III. Honestly, this lens is hard to use and isn’t intended for a full-frame sensor, but let’s see how it looks.

I found a decorative wool pumpkin decorating our house for fall and thought its woven texture would make a great test.

Both images were artificially lit with an external LED light source to maximize detail.
You can see how much more detail and clarity comes out of the iPhone lens.

Here’s another comparison of a closer shot.

The iPhone 13 Pro Max macro capability is fantastic. Granted, I could probably match quality with an expensive macro lens on my Sony Mirrorless camera, but now I see no need with the iPhone camera.

Lastly, here’s a small collection of great macro images I captured on the iPhone 13 Pro Max.

 

In the two weeks since the iPhone 13 Pro was released, I’ve been taking macro shots of everything I can find, and like most new iPhone features, I think this one will stick around. We’ll see in a year how many iPhone macro shots I’ve accumulated.

Check out my iPhone 13 Pro Max review here (coming soon)

Real Photography on the iPhone 6s Plus

As a self-proclaimed photography enthusiast, I’m always trying to get the best shot and get my work seen by everyone. I normally do this by dragging my Nikon DSLR and several lenses around with me everywhere I go, just in case I see something that needs capturing. My take on great photography is 50% having the right eye for the shot and 50% being in the right place at the right time. Neither of those requirements says anything about having the best equipment.

A recent business trip took me to Las Vegas, and during my free time I ventured out to Valley of Fire State Park, one of the most unique places I’ve ever been. Because of the short trip and the hassle of bringing multiple bags, I opted not to pack my Nikon DSLR. I do have my new Apple iPhone 6s Plus with me at all times, so I decided this would be a great place to test the newly upgraded camera.

The ability to climb around on rocks, hike down trails, and quickly snap great pictures without having to be mindful of my expensive camera equipment was great! Until now, I had never considered a phone a realistic way to capture great photos. The new Apple iPhone didn’t let me down.

Here are some of the great shots I got on my short trip:

The new camera sensor as well as the optical image stabilization were key to getting great photos from such a compact device.

I’m going to continue to investigate this and might even try bringing both cameras on the next trip.

Shooting Photos on the new Apple iPhone 6s

Brad Magin swapped out his usual DSLR for the new Apple iPhone 6s Plus on 9/19/2015 and shot some really amazing photographs in various lighting.

The upcoming iPhone 6s is the biggest camera bump Apple has ever included in a new iPhone. These photos give you a glimpse of what’s possible with this next-generation device.

Check out this link for the gallery:

http://www.si.com/mlb/photos/2015/09/21/iphone-6s-plus-sneak-preview-photos-si/1