Technology of the smartphone, especially the camera: a history for Dummies

Hi,
I am quite sure somebody here can explain some of the differences between a normal analog camera, a digital camera and a smartphone camera (apart from the film versus the sensor). What are the differences? Which features are still the same? What kind of technology has carried over from the past to the present? And where are we going in the future?

If you also have books for beginners to recommend, I’d be happy to learn.
Thank you so much. Do


All in all I’d say that when we’re talking about SLRs and other more professional cameras, the process of photography hasn’t massively changed. After all, digital cameras are designed to emulate their analog predecessors, both in operation and results. A benefit of digital is that it’s possible to take many more pictures at no additional cost, thus enhancing the odds of getting a good one - and of course, the fact that photographs can be examined immediately after taking. Other than that, the operation of a digital SLR isn’t massively different from an analog camera - usually a bit easier, what with digital menus and more intelligent functionality.

When it comes to point-and-shoot cameras like the ones inside smartphones, technology has come a very, very long way in a short time. Hardware has been successfully minimised and software has been improved to the point that the teeny tiny camera inside my FP2 can take pictures I couldn’t possibly take on my old Sony point-and-shoot. We are now also squarely in the era of software enhancement: stuff like fake depth of field and neural networks allow us to take pictures that were once impossible to take with similar hardware.
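The “fake depth of field” mentioned above is, at its core, masked blurring: keep the subject sharp and blur everything else. Here is a minimal, hypothetical sketch of that compositing step (not any phone vendor’s actual pipeline), assuming only NumPy and a hand-made foreground mask in place of the depth estimate a real phone would compute:

```python
import numpy as np

def box_blur(img, k=5):
    """Naive box blur: average each pixel over a k-by-k neighborhood."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(-pad, pad + 1):
        for dx in range(-pad, pad + 1):
            out += padded[pad + dy : pad + dy + img.shape[0],
                          pad + dx : pad + dx + img.shape[1]]
    return out / (k * k)

def fake_depth_of_field(img, mask, k=9):
    """Composite: keep the in-focus subject (mask == True), blur the rest."""
    blurred = box_blur(img, k)
    return np.where(mask, img, blurred)

# Toy example: a "subject" square on a noisy background.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 20:40] = True
result = fake_depth_of_field(img, mask, k=9)
```

Real portrait modes replace the hand-made mask with a per-pixel depth map (from dual pixels, a second camera, or a neural network) and apply a depth-dependent, disc-shaped blur rather than a uniform box filter, but the compositing idea is the same.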

I don’t know enough about cameras to give a concise history and future of photography, though. If you’re interested, there’s a series of recorded lectures on digital photography by Marc Levoy that I came across some years ago when I was looking for one specific piece of information, but I ended up watching the entire series (though I skipped some of the more technical stuff). Sixteen lectures, over an hour apiece - so quite exhaustive.

It isn’t exclusively about the topic you’re asking about, but it includes the development of photography and differences between analog and digital photography over the course of the lectures. It’s quite interesting and has made me a better photographer, even if I haven’t really held on to all the information about F-stops and apertures.

Something to watch in our current age of quarantine, perhaps!


Your comment is very helpful and I will go for the lessons, thank you. I just found an article you might also be interested in, on the future of photography: smart lenses. I’ll copy it here.

Imagine a contact lens that autofocuses within milliseconds. That could be life-changing for people with presbyopia, a stiffening of the eye’s lens that makes it difficult to focus on close objects. Presbyopia affects more than 1 billion people worldwide, half of whom do not have adequate correction, said the project’s leader, Hongrui Jiang, Ph.D., of the University of Wisconsin, Madison. And while glasses, conventional contact lenses and surgery provide some improvement, these options all involve the loss of contrast and sensitivity, as well as difficulty with night vision. Jiang’s idea is to design contacts that continuously adjust in concert with one’s own cornea and lens to recapture a person’s youthful vision.

The project, for which Jiang received a 2011 NIH Director’s New Innovator Award (an initiative of the NIH Common Fund) funded by the National Eye Institute, requires overcoming several engineering challenges. They include designing the lens, algorithm-driven sensors, and miniature electronic circuits that adjust the shape of the lens, plus creating a power source - all embedded within a soft, flexible material that fits over the eye.

In their latest study, published in Proceedings of the National Academy of Sciences, Jiang and his team focused on a design for the image sensors. “The sensors must be extremely small and capable of acquiring images under low-light conditions, so they need to be exquisitely sensitive to light,” Jiang said.

The team took their inspiration from the elephant nose fish’s retina, which has a series of deep cup-like structures with reflective sidewalls. That design helps gather light and intensify the particular wavelengths needed for the fish to see. Borrowing from nature, the researchers created a device that contains thousands of very small light collectors. These light collectors are finger-like glass protrusions, the inside of which are deep cups coated with reflective aluminum. The incoming light hits the fingers and then is focused by the reflective sidewalls. Jiang and his team tested this device’s ability to enhance images captured by a mechanical eye model designed in a lab.

In separate studies, the researchers have designed and tested a couple of different approaches for the contact lens material. For one approach, they formed a liquid lens from a droplet of silicone oil and water, which won’t mix. The droplet sits in a chamber atop a flexible platform, while a pair of electrodes produces an electric field that modifies the surface tension of each liquid differently, resulting in forces that squeeze the droplet into different focal lengths. The lens is able to focus on objects as small as 20 micrometers, roughly the width of the thinnest human hair.
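As a rough illustration of why curvature controls focus here (the numbers below are generic textbook values, not from the paper): the optical power of a single spherical oil-water interface is (n2 − n1) / R, so squeezing the droplet into a tighter curve (smaller R) shortens the focal length. A hypothetical sketch:

```python
def interface_power(R_m, n1=1.33, n2=1.40):
    """Optical power (diopters) of a single spherical interface between
    water (n1) and silicone oil (n2) with radius of curvature R_m (meters)."""
    return (n2 - n1) / R_m

def focal_length_mm(R_mm, n1=1.33, n2=1.40):
    """Image-side focal length f = n2 / P for that interface, in mm."""
    P = interface_power(R_mm / 1000.0, n1, n2)
    return (n2 / P) * 1000.0

# Squeezing the droplet into a tighter curve shortens the focal length:
for R_mm in (5.0, 2.0, 1.0):
    print(f"R = {R_mm} mm -> f = {focal_length_mm(R_mm):.1f} mm")
```

In the actual device the electrodes change each liquid’s surface tension (electrowetting), which is what reshapes the droplet; the mapping from applied voltage to R is deliberately left out here because it depends on the specific chamber geometry.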

They developed another type of lens inspired by the compound eyes of insects and other arthropods. Insect eyes comprise thousands of individual microlenses that each point in different directions to capture a specific part of a scene. Jiang and his colleagues developed a flexible array of artificial microlenses. “Each microlens is made out of a forest of silicon nanowires,” Jiang explained. Together, the microlenses provide even greater resolution than the liquid lens. The array’s flexibility makes it suitable not only for contact lenses, but for other potential uses. Wrap it around a laparoscopic surgical scope and you’ve got a high-resolution, 360-degree view inside a patient’s body. Mount it on a lamppost and you can see the surrounding intersection from all sides.

In order to change focus, the contact lens will also need to be equipped with an extremely small, thin power source.

Jiang’s working solution: a solar cell that simultaneously harvests electrons from sunlight, converting them into electricity, and that also stores energy within a network of nanostructures. It works much the way a conventional solar panel does, but the addition of storage capability within a single device is novel, Jiang said. The device still needs tweaking, but the team is optimistic that it will be powerful enough to drive the lens yet small enough to fit the space available.

A prototype for clinical testing may still be five to 10 years off, Jiang said. Once it’s available, however, it may not cost much more than conventional contact lenses. “There’s a huge market for this and with mass production, the cost is not likely to be a barrier,” he said.


It is the consensus on the internet, just as in real life, not to simply copy and paste text from one place to another without giving a source.
But on the internet there’s an even easier way to share what you found … since you should give the source anyway, why not just post the source in the first place, e.g.

https://www.nei.nih.gov/about/news-and-events/news/fish-and-insects-guide-design-future-contact-lenses

Interested people can then simply follow the link.
There is, of course, a chance that such a source address will no longer be available at some point in the future, or that the content found there will change.

If the text you want to share is too important to be subjected to such coincidences, then the sensible thing to do is to quote it properly instead (with the source).


You are completely right. Thank you for your comment.
Here is the reference for the full article I shared before:
Artificial eye for scotopic vision with bioinspired all-optical photosensitivity enhancer
Hewei Liu, Yinggang Huang, and Hongrui Jiang
PNAS April 12, 2016, 113 (15), 3982–3985; first published March 14, 2016.
Edited by John A. Rogers, University of Illinois, Urbana, IL, and approved February 5, 2016 (received for review September 9, 2015).

This topic was automatically closed 182 days after the last reply. New replies are no longer allowed.