Faster Fourier transform named one of world's most important emerging technologies
Bob Lawrence (VE1RLL)
http://web.mit.edu/newsoffice/2012/faster-fourier-transform-named-one-of-worlds-most-important-emerging-technologies.html
Earlier this year, Professors and CSAIL Principal Investigators Piotr Indyk and Dina Katabi, along with CSAIL graduate students Haitham Hassanieh and Eric Price, announced that they had improved upon the Fourier transform, an algorithm for processing streams of data. Their new algorithm, called the sparse Fourier transform (SFT), has been named to MIT Technology Review's 2012 list of the world's 10 most important emerging technologies.
With the SFT algorithm, streams of data can be processed 10 to 100 times faster than was possible before, allowing for a speedier and more efficient digital world.
"We selected the sparse Fourier transform developed by Dina Katabi, Haitham Hassanieh, Piotr Indyk and Eric Price as one of the 10 most important technology milestones of the past year because we expect it to have a significant impact," said Brian Bergstein, Technology Review's deputy editor. "By decreasing the amount of computation required to process information, this algorithm should make our devices and networks more powerful."
Each year, the editors of MIT Technology Review select the 10 emerging technologies with the greatest potential to transform our world. These innovations promise fundamental shifts in areas including energy, health care, computing and communications. The SFT is featured in the May/June edition of Technology Review and is posted on the web at http://www.technologyreview.com/tr10/.
Comments
It seemed that she knew far more than I did. She worked in England during World War II for the government in calculating FFTs manually as part of the war effort against German U-boats.
There are tons of applications of FFTs that we take for granted these days, but back then it all had to be done with pencil and paper. And even then, it was considered fast.
I wonder, could this be used for "sparse video" to use less memory?
That is interesting, because it is generally accepted that the Fast Fourier Transform was invented/discovered by Cooley and Tukey in 1965, long after the Second World War. Only much later still was it found that Gauss had the same idea around 1805.
So either that lady was computing an FFT that the British war effort knew about but kept secret, or she was computing the Fourier transform the old, long way (a direct DFT).
Interestingly, the women who were usually employed to perform such manual calculations were known as "computers". How they did all that, without necessarily understanding what they were computing and without going nuts, is a mystery to me.
Except it was not considered fast back then, as the FFT was unknown at the time.
Prof_Braino,
Sadly, the "FFT for dummies" effort came to a bit of a halt. I'm still trying to find a way to present the idea without getting bogged down in the maths too much.
As I have said here a few times, it took me half my life to understand the FFT well enough to implement my own from scratch (without reference to existing code) in C, Spin and PASM. I suspect my remaining life span is not enough to accommodate this new version. :(
What they seem to be saying is that any given meaningful signal contains far fewer significant frequency components than random samples would. Therefore there is a lot of calculation in getting the transform that you don't need to do; hence the "sparse" FFT. So how on earth do they decide what is to be included and what is important?
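This is not the actual SFT algorithm (which uses clever bucketing and filtering tricks), just a rough numpy sketch of the sparsity idea it exploits: a "meaningful" signal built from a handful of made-up tones has only a few large FFT bins, so most of what a dense FFT computes is effectively wasted. The tone frequencies and the threshold below are illustrative assumptions, not anything from the paper.

```python
import numpy as np

# A hypothetical "meaningful" signal: 3 tones plus a little noise.
# The tone frequencies (50, 200, 1234) are made-up illustrative values.
n = 4096
t = np.arange(n)
signal = (1.0 * np.sin(2 * np.pi * 50 * t / n)
          + 0.7 * np.sin(2 * np.pi * 200 * t / n)
          + 0.4 * np.sin(2 * np.pi * 1234 * t / n)
          + 0.01 * np.random.randn(n))

# A dense FFT does O(n log n) work and fills all n output bins...
spectrum = np.fft.fft(signal)
magnitude = np.abs(spectrum)

# ...but only a handful of those bins carry almost all of the energy.
threshold = 0.1 * magnitude.max()          # arbitrary cut-off for "significant"
significant = np.count_nonzero(magnitude > threshold)
print(f"{significant} of {n} bins are significant")   # a few, not thousands

# The sparse FFT's promise is to locate roughly those few big bins directly,
# without computing (or even touching) all n of them.
```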
This is a form of compression, and it is surprising, because normally we take the transform first, then throw away the lower-amplitude frequencies, and only then Huffman encode or otherwise compress the result, as in MP3 or JPEG.
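For contrast, here is a minimal sketch of that usual transform-then-discard pipeline the comment describes (not the SFT itself): take a full FFT, keep only the k largest-magnitude coefficients, and reconstruct. The test signal and the choice of k = 6 are illustrative assumptions; the point is just that a frequency-sparse signal survives the pruning almost untouched.

```python
import numpy as np

def keep_top_k(x, k):
    """Transform-then-discard: FFT first, keep only the k largest-magnitude
    coefficients, zero the rest. Those few surviving coefficients are what
    would then be entropy coded (e.g. Huffman) in an MP3/JPEG-style scheme."""
    spectrum = np.fft.fft(x)
    keep = np.argsort(np.abs(spectrum))[-k:]      # indices of the k biggest bins
    pruned = np.zeros_like(spectrum)
    pruned[keep] = spectrum[keep]
    return pruned

# Reusing the 3-tone test signal idea from the previous sketch (made-up values).
n = 4096
t = np.arange(n)
x = (np.sin(2 * np.pi * 50 * t / n)
     + 0.7 * np.sin(2 * np.pi * 200 * t / n)
     + 0.4 * np.sin(2 * np.pi * 1234 * t / n))

pruned = keep_top_k(x, k=6)                       # 6 bins: 3 tones x (+/- frequency)
reconstructed = np.fft.ifft(pruned).real

# Even though all but 6 of the 4096 coefficients were thrown away,
# the reconstruction error for this frequency-sparse signal is tiny.
print("max reconstruction error:", np.max(np.abs(x - reconstructed)))
```

The difference with the sparse FFT, as far as I can tell, is that it tries to avoid computing the full transform in the first place, rather than computing it all and discarding most of it afterwards.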