
Post a Comment On: Backreaction

"Big data meets the eye"

15 Comments
Blogger Arun said...

Not really related, but here is the Angelina Jolie curve:

http://www.wolframalpha.com/input/?i=Angelina+Jolie+curve&lk=1&a=ClashPrefs_*PlaneCurve.AngelinaJolieCurve

6:51 AM, November 07, 2013

Blogger Harbles said...

I assume the scientific community saw this.
http://www.designboom.com/technology/elon-musks-tony-stark-3d-interface-for-designing-spacex-rockets/
Any application to your topic above? Does one still need to know a bit about what one is looking for before one can decide how to present the data for visualization in 3D space?

6:51 AM, November 07, 2013

OpenID muon said...

Nice article (in fact I have enjoyed several of your latest articles). Although I am an experimentalist, not a theorist, it is obvious that we need theorists, and the proposition that theory is little more than finding patterns in sets of numbers is absurd. The laws of physics as we learn them in school are the real point of physics, and they tell us much more than "there is a pattern in a specific data set."

Anyway, the fact that modern and capable visualization tools play such an important role in some theoretical research is intriguing. I don't think this is the case for particle physics, but could it be?

8:34 AM, November 07, 2013

Blogger Plato Hagel said...

One-Loop Calculations with BlackHat

Just write a better algorithm for a search feature with increased data like they did above?

Data transfer is big money, and most people have been sucked in by its metered use. If you want imaging and you like to watch movies, maybe the transfer rate can be accommodated? :)

Best,

9:32 AM, November 07, 2013

Blogger Plato Hagel said...

Hmmmm... got me thinking. :)


Architecture of Trigger System

Worldwide LHC Computing

9:38 AM, November 07, 2013

Blogger Plato Hagel said...

Okay, last post. :)

"That his family could watch his dissertation defense over streaming media illustrated to him the usefulness of visualization: “What we are discovering about DNA is easy to grasp when you can see it,” says Freeman. “We can now fully illustrate, through computation, how DNA interacts with other entities such as proteins within a cell. We can show the public what our research looks like.” Freeman pointed out that what they discover through computation (in silico), they always confirm in the real world (in vivo) – but he says that computation should matter to the general public because it enables researchers to study a wide range of interactions between key biological molecules in an inexpensive manner. It speeds up drug discovery. Thus, funding science is extremely valuable to everyone." Using the OSG to simulate DNA-protein interaction

9:53 AM, November 07, 2013

Blogger Uncle Al said...

The simple solution: an algorithm that emulates pareidolia. Sounds like a neural net job. What content does a non-random image possess?

" non-centrosymmetric crystals which have a relativistic spin-splitting of the conducting bands." Heavy atom semiconductor tellurium crystallizes in enantiomorphic space groups P3(1,2)21. Piezoelectric crystals are non-centrosymmetric insulators (though they may have mirror planes).

http://www.sciencemag.org/content/317/5842/1196.abstract
http://arxiv.org/abs/1304.2970
http://www.nature.com/news/interface-superconductivity-found-in-single-crystal-1.13815

http://www.calculushumor.com/3/post/2012/11/the-batman-curve.html
The power of seeing stuff
http://mathworld.wolfram.com/BatmanCurve.html
The power of knowing stuff

11:12 AM, November 07, 2013

Blogger Chris Kennedy said...

Nice Batman Curve. Sure beats the almost Superman curve you get with a weak acid titration.

12:30 PM, November 07, 2013

Blogger deepak said...

Dear Bee,

You state: "One still needs the algorithm that is able to find patterns. And for that algorithm, one needs to know what one is looking for to begin with." ... Not necessarily. First there is Occam's razor, which tells us that we should look for the simplest possible model. Then there is Balasubramanian's work relating Occam's razor and statistical mechanics (http://arxiv.org/abs/cond-mat/9601030v1). Essentially, one can construct a partition function describing the dynamics of "agents" which explore some parameter space and eventually settle down when they find a minimum, which often corresponds to the simplest possible model.
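The agents-settling-into-a-minimum idea can be caricatured with a tiny toy, not Balasubramanian's actual construction: a single agent random-walks (Metropolis-style) over a handful of candidate models, each assigned a made-up misfit value and a complexity penalty, and the annealed walk settles on the model with the lowest "free energy" (error plus complexity). All numbers here, including the penalty weight, are invented for illustration.

```python
import math
import random

random.seed(0)

# Toy "model space": name -> (data misfit, number of parameters).
# Misfit values are invented: the quadratic fits the noise slightly
# better than the linear model, but pays for an extra parameter.
models = {
    "constant":  (5.00, 1),
    "linear":    (0.10, 2),
    "quadratic": (0.08, 3),
}

LAMBDA = 0.5  # weight of the Occam (complexity) penalty, chosen arbitrarily

def cost(name):
    """'Free energy' of a model: misfit plus complexity penalty."""
    misfit, k = models[name]
    return misfit + LAMBDA * k

def anneal(steps=2000, t0=2.0):
    """One agent random-walking over model space with cooling temperature."""
    state = random.choice(list(models))
    best = state
    for i in range(steps):
        temp = t0 * (1 - i / steps) + 1e-3  # cool down toward zero
        proposal = random.choice(list(models))
        delta = cost(proposal) - cost(state)
        # Metropolis rule: always accept improvements, sometimes accept worse
        if delta < 0 or random.random() < math.exp(-delta / temp):
            state = proposal
        if cost(state) < cost(best):
            best = state
    return best

print(anneal())  # the agent settles on "linear": the simplest adequate model
```

With these numbers the quadratic has the smallest misfit, but the complexity term makes the linear model the global minimum, so the walk ends there, which is the Occam-like behavior Deepak describes.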

Even more interestingly, very recently Jonathan Heckman has argued that one can derive string theory from a similar line of reasoning involving agents exploring a parameter space - which happens to correspond to the target spacetime of a string. The agents correspond to the points on the string.

Of course, that does not do away entirely with the need for some sort of intelligent inferential systems - such as humans - to make sense of big data, but it does bolster Chris Anderson's claim.

Fascinating article nevertheless.

Cheers,

Deepak

5:55 AM, November 08, 2013

Blogger Sabine Hossenfelder said...

Harbles,

That is pretty damn cool :) I think though that the data manipulation they worked with was less fancy. But, yeah, I guess that's the future. I mean, we've all gotten pretty used to zooming with our thumb and index finger, no? I'm always pissed off, though, that I can't do it with the index and middle finger instead. Best,

B.

9:37 AM, November 08, 2013

Blogger Sabine Hossenfelder said...

Muon,

Well, particle physics illustrates the problem with the 'end of theory' idea. If you have no theory, you don't know what to look for in the data. I mean, you can generically look for deviations from the predictions of the theory you have, but that only gets you so far. Typically you'll have to look for some signature that fulfills different requirements to get a significant result, and the different requirements necessitate that you have a theory that tells you they should occur together. To drive it to an extreme: imagine you had all the LHC data and no theory at all. Do you think we'd be able to arrive at correlation tables as good as the standard model? The answer is almost certainly no. Even if you could correctly extract all correlations, it would be a terribly slow and clumsy operation, and I can't for the hell of it see how it would be good enough to come up with higher order corrections that might be testable in the future.

(All this btw is not to say that it is not possible in principle, just that it isn't possible in practice, not now and not any time soon.) Best,

B.

9:45 AM, November 08, 2013

Blogger Uncle Al said...

h-Index studies raise the h-indices of studiers. Do orbits earn Frequent Flyer miles?

http://www.nature.com/news/divinations-of-academic-success-may-be-flawed-1.14113

A single failed reaction is a setback, a million failed reactions are a combinatorial library. Whatever its virtues, Big Data will "mature" into fashionable snipe hunts - then to be studied. Science was once performed not administered. 1) Collect frightening savants, dump them in back rooms, toss in raw meat. 2) Return to your desk, put your head in your hands, sweat blood. 3) Miracles occur. 4) Go in back, hose off the responsible lab apes, confiscate the wonders.

Science is now defined as formalized mediocrity plus retaliation by codes of conduct. The future was meant to be dangerous. A doormat has a much larger surface area than an ice pick. So what?

11:36 AM, November 08, 2013

Blogger Zephir said...

The role of reality in science: The universe's most powerful enabling tool is not knowledge or understanding but imagination because it extends the reality of one's environment.

5:58 AM, November 09, 2013

Blogger Sabine Hossenfelder said...

John Horgan has an interesting post at SciAm blogs about big data and science:

"[I]n this post I’ll suggest that Big Data might be harming science, by luring smart young people away from the pursuit of scientific truth and toward the pursuit of profits."

Are “Big Data” Sucking Scientific Talent into Big Business?


(And I always forget that data is a plural.)

8:34 AM, November 09, 2013

Blogger GSC said...

Chris Anderson prefaces his Wired article with statistician George Box's pithy saying "All models are wrong, but some are useful" - which is true enough. But he should have thought about it a bit more - he might then have discovered the next level of that idea: "And some models are actually absurd", which nicely describes Anderson's claim about "The End of Theory".

GSC

11:53 PM, November 10, 2013
