Category Archives: english

SPIE DCS 2018: CCSI – Computational Imaging

This year I’m chairing the Computational Imaging session at SPIE Defense + Commercial Sensing in Orlando, Fla., April 16-19, 2018, together with Aamod Shanker. We have invited many amazing speakers, and we are organizing a panel discussion on trends in computational imaging.

Here’s the program:

SESSION 6 TUE APRIL 17, 2018 – 11:10 AM TO 12:00 PM
Computational Imaging I
[10656-22] “Ultra-miniature…” David G. Stork, Rambus Inc. (USA)
[10656-36] “Computed axial lithography: volumetric 3D printing of arbitrary geometries” Indrasen Bhattacharya
Lunch/Exhibition Break Tue 12:00 pm to 1:50 pm

SESSION 7 TUE APRIL 17, 2018 – 1:50 PM TO 3:30 PM
Computational Imaging II
[10656-24] “Terahertz radar for imaging…” Goutam Chattopadhyay
[10656-23] “Computational imaging…” Lei Tian
[10656-26] “Achieving fast high-resolution 3D imaging” Dilworth Y. Parkinson
[10656-27] “Linear scattering theory in phase space” Aamod Shanker

PANEL DISCUSSION TUE APRIL 17, 2018 – 4:00 PM TO 6:00 PM

TUESDAY POSTER SESSION TUE 6:00 PM TO 8:00 PM

SESSION 8 WED APRIL 18, 2018 – 8:00 AM TO 10:05 AM
Computational Imaging III
[10656-28] “High resolution 3D imaging…” Michal Odstrcil
[10656-29] “A gigapixel camera array…” Roarke Horstmeyer
[10656-30] “EUV photolithography mask inspection using Fourier ptychography” Antoine Wojdyla
[10656-31] “New systems for computational x-ray phase imaging…” Jonathan C. Petruccelli
[10656-68] “Low dose x-ray imaging by photon counting detector” Toru Aoki

Continue reading

Art and science (IX) – Neural networks

This is a continuation of a series of blog posts, written mostly in French, about art and science.

In the past few years, we’ve seen the emergence of Deep Neural Networks (DNNs), and among the latest developments are Generative Adversarial Networks (GANs), in which two neural networks are pitted against each other so that one learns the best way to generate an object from a label or a simple drawing, or to mimic the style of an artist.

The first ripple in the vast ocean of possibility was Deep Dream, though it wasn’t technically a GAN:

Now, things have evolved even further: you can not only generate trippy videos, but also use neural networks to emulate the style of an artist and generate, from scratch, content that is genuinely appealing!

Continue reading

Oasys

With increasingly tight beamline specifications, optical modeling software becomes necessary to design conceptual beamlines and predict their performance. This is particularly true with the advent of highly coherent light sources (such as the proposed upgrade of the ALS), where additional considerations such as mirror deformation under heat load and the effects of partial coherence need to be studied. Luca Rebuffi will present the latest features of OASYS/Shadow, an optical beamline modeling tool widely used in the synchrotron community, and show how to get started with beamline simulations.

Program: Continue reading

Self-reference

Self-reference is a cornerstone of Hofstadter’s Gödel, Escher, Bach, a must-read book for anyone interested in logic (and we shall rely on logic these days to stay sane).

Here’s a bunch of examples of self-reference that I found interesting, curated just for you!

Barber’s paradox:

The barber is the “one who shaves all those, and those only, who do not shave themselves.” The question is, does the barber shave himself?

Self-referential figure (via xkcd):

Tupper’s formula, which prints itself on a screen (via Brett Richardson).
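
The inequality simply tests one bit of a very large integer, so it is essentially a one-liner in Julia; here is a quick sketch of mine (with the 543-digit constant k left out):

# Tupper's formula: pixel (x, y) is lit iff
#   1/2 < floor(mod(floor(y/17) * 2^(-17*floor(x) - mod(floor(y), 17)), 2)),
# i.e. iff bit number (17*x + y mod 17) of floor(y/17) equals 1.
tupper_pixel(x::Integer, y::BigInt) = isodd(fld(fld(y, 17), big(2)^(17x + mod(y, 17))))

# Evaluating tupper_pixel(x, k + j) for x = 0:105 and j = 0:16, with k the
# published 543-digit constant (omitted here), yields a bitmap of the formula itself.

Continue reading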

Moore’s wall

A single chip such as the Intel Xeon Phi has a computational power in excess of 1 TFLOPS and features more than a hundred billion transistors. Few people outside the world of semiconductor engineering appreciate this, but that is a fantastical number: 100,000,000,000. If every transistor were a pixel, you would need a wall of more than 100 × 100 4K TV screens to display them all!
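
As a quick back-of-the-envelope check (a throwaway Julia snippet, assuming about 8.3 million pixels per 4K panel):

# How many 4K screens does it take to give each transistor its own pixel?
transistors   = 100e9              # ~10^11 transistors, one pixel each
pixels_per_tv = 3840 * 2160        # ≈ 8.3 million pixels on a 4K panel
screens       = transistors / pixels_per_tv     # ≈ 12,000 screens
side          = ceil(Int, sqrt(screens))        # ≈ 110 screens per side
println("≈ $(round(Int, screens)) screens, i.e. roughly a $side × $side wall")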

Over the past fifty years, the semiconductor industry has achieved incredible things, in part thanks to planar technology, which allowed the manufacturing process to scale exponentially, following Moore’s law. But it seems that we’re about to hit a wall soon.

faith_no_moore

Let’s give an overview of where we stand, and where we go from here!

Continue reading

Quick’n’dirty

Over the years, I’ve collected quotes from all sorts of people.

I always like quotes, because they are atoms of knowledge, quick and dirty ways to understand the world we only have one life to explore. To some extent, they are axioms of life, in that they are true and never require an explanation (otherwise they wouldn’t be quotations).

Here’s a bunch of quotes that I found particularly interesting, starting with my absolute favorite, which comes from the great Paul Valéry:

The folly of mistaking a paradox for a discovery, a metaphor for a proof, a torrent of verbiage for a spring of capital truths, and oneself for an oracle, is inborn in us. – Paul Valéry

On research — trial and error

Basic research is like shooting an arrow into the air and, where it lands, painting a target.
-Homer Burton Adkins

A thinker sees his own actions as experiments and questions–as attempts to find out something. Success and failure are for him answers above all.
– Friedrich Nietzsche
Continue reading

Learning Deep

In the past four years, there’s been a lot of progress in the field of machine learning, and here’s a story seen from the outskirts.

Eight years ago, for a mock start-up project, we tried to do some basic head tracking. At that time, my professor Stéphane Mallat told us that the most efficient way to do this was the Viola-Jones algorithm, which was still based on hard-coded features (integral images and Haar features) and a boosted classifier (AdaBoost).

(I was thrilled when, a few years later, the Amazon Fire Phone embedded similar features; unfortunately, it was a complete bomb. Better technologies now exist and will make a splash pretty soon.)

At the time, the most advanced books on machine learning were “Information Theory, Inference, and Learning Algorithms” by David MacKay, a terrific book to read, and “Pattern Recognition and Machine Learning” by Chris Bishop (which I never read past chapter 3, for lack of time).

Oh boy, how things have changed!

Continue reading

Books I loved

With every year comes the occasion to read new books!

I’ve assembled a small collection of books that I love, so that you can discover them and share the love around! Since there are twelve months in a year, you’ll find twelve books. They are presented in no particular order, so that you can enjoy them at random, sitting on a couch and sipping some wine.


Martine’s legs – Henri Cartier-Bresson (1967)

Ringolevio (Emmett Grogan)

Ringolevio is a sort of autobiography by Emmett Grogan, a leader of the Diggers in San Francisco just about when it was becoming cool (the mid-’60s). It is a great book in that it is written with punch and has a deep sense of social awareness. It is quite fun to read Timothy Leary and other fake prophets of the revolution getting thrashed.

Emmett wondered whether anything viable was going to come out of it: whether the powerless might for once obtain enough power to make some sort of relevant change in the society. He immediately dismissed as ridiculous the notion that everything would be all right when everyone turned on acid. It was noted that LSD was used during World War Two to solve naval tactical maneuvers, and they concluded that although the drug might facilitate understanding or the process of doing something, it offered no moral direction or imperatives.

There is no road (Antonio Machado)

This is my absolute favorite book of poems. It is very short, and contains some of the deepest thoughts ever assembled in a book. This book is a treasure, and I have given it to people I care about. It is often out of print, but don’t settle for a different collection: this one is really unique if you can find it, with by far the best translations I’ve found.

Between living and dreaming there is a third thing. Guess what it is
– Antonio Machado

Continue reading

Julia Language

A few years ago, I got interested in the then-nascent Julia language (julialang.org), a new open-source language with MATLAB-like syntax and C-like performance, thanks to its just-in-time compiler.
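
To give a flavor of the syntax, here is a toy example of mine (not tied to any particular package): plain loops are compiled to fast machine code on first call, so there is no need to vectorize for speed, yet broadcasting still reads like MATLAB.

# A plain Julia function: the JIT compiles it to native code the first time
# it is called, so the explicit loop runs at C-like speed.
function escape_time(c::Complex{Float64}, maxiter::Int=100)
    z = zero(c)
    for i in 1:maxiter
        z = z^2 + c                  # the same update you would write in MATLAB
        abs2(z) > 4 && return i
    end
    return maxiter
end

# MATLAB-style "vectorized" call over a grid, via broadcasting:
xs = LinRange(-2.0, 1.0, 600)
ys = LinRange(-1.2, 1.2, 400)
img = escape_time.(complex.(xs', ys))    # 400 × 600 array of escape times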


Large Synoptic Survey Telescope (LSST, Chile) data being processed with Julia on supercomputers, with a 200x speedup (from https://arxiv.org/pdf/1611.03404.pdf)

The language now seems to be gaining traction, with many available packages, lots of REPL and notebook integration (it works with Atom + Hydrogen, and Jupyter owes the “Ju” in its name to Julia, alongside Python and R), and it is delivering on performance.

Julia is now used on supercomputers such as Berkeley Lab’s NERSC, and taught at MIT (by no less than Steven G. Johnson, the man who brought us FFTW and MEEP!). I’ve also noticed that some of the researchers from Harvard’s RoLi Lab whom I invited to SPIE DCS 2018 are sharing the Julia code from their paper “Pan-neuronal calcium imaging with cellular resolution in freely swimming zebrafish”. Pretty cool!

Julia used for code-sharing in a Nature publication. I wish I could see that every day!


I got a chance to attend parts of JuliaCon 2017 in Berkeley. I was amazed by how dynamic the community is, supported in part by the Moore Foundation (Carly Strasser, now head of the Coko Foundation), and happy to see Chris Holdgraf (my former editor at the Science Review) thriving at the Berkeley Institute for Data Science (BIDS).

Julia picking up speed at Intel (picture taken during JuliaCon 2017)


I started sharing some code for basic image processing (JLo) on GitHub. Tell me what you think!

(By the way, I finally shared my MEEP scripts on GitHub, and they’re here!)

Soft power

Over the past twelve months, I have become enamored with Middle Eastern music, which now wears many clothes, from electro to Queen-esque Arabic music.
Since it’s Thanksgiving, here are some offerings!

It seems that the best way to fight rampant Islamophobia is to emphasize the beauty of the culture and blend it with modern tunes. Youth will follow, and eventually replace the old patriarchy that has plagued so many Muslim countries.

Never underestimate the healing power of music
– Mark Kozelek/Sun Kil Moon

Acid Arab is a French DJ act that mixes Arab tunes with electronic music. Their mix at Sónar 2016 is an absolute masterpiece, blending fantastic rhythms with melodies rarely heard in electronic music.

Mashrou’ Leila is a Lebanese band, extremely popular in the Middle East. Their music is very varied, and the singer Hamed Sinno has a Freddie Mercury-like persona and brings a lot of poetry. “Shim El Yasmine” is a song about the scent of jasmine and about his partner at the time, who went on to leave him and marry a woman, because society wouldn’t understand that love is love…

A week before I went to see them live at Slim’s in San Francisco, the band had been banned from el-Sisi’s Egypt, where people in the audience had raised rainbow flags… Continue reading