
CERN's LHC Open Data


posted on Jan, 14 2020 @ 09:14 PM
So once again, for those who claim science is some kind of black box, closed off to the layman, in which people in lab coats sit in darkened rooms doing unspeakable secret experiments... well, those scientists will be putting out not only huge data sets, but also a huge amount of the knowledge base on how they analyzed the data.

So that's the data, plus the insides of the hardcore statistical analysis, for everyone to look at and do things with should they desire. Now obviously this isn't raw data; for that you'd need grid certificates for access to many petabytes, and needless to say, roaming through all of that data requires thousands of CPU hours... So in many ways, this is the best you're gonna get.
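For anyone who wants to poke at the release, here's a minimal sketch of reading one of the files in Python with uproot, a library commonly used for ROOT files in particle physics. The file URL, tree name, and branch names are placeholders I'm assuming for illustration; the real ones are listed on the record pages at opendata.cern.ch.

```python
# Minimal sketch: read a few branches from one open-data ROOT file.
# URL, tree name, and branch names below are hypothetical placeholders;
# substitute real ones from the record pages on opendata.cern.ch.
import uproot  # pip install uproot

FILE = "https://opendata.cern.ch/path/to/some_atlas_sample.root"  # placeholder
TREE = "mini"  # assumed; ATLAS educational releases have used a tree named "mini"

with uproot.open(FILE) as f:
    tree = f[TREE]
    print(tree.num_entries, "events in file")
    # Pull two assumed per-event branches into NumPy arrays
    arrays = tree.arrays(["lep_pt", "lep_eta"], library="np")
```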

home.cern...

This isn't the first time an LHC experiment has opened up its data... odd how, despite the data being out there for people to look at, people continue to make silly claims about 'how science works' when it comes to places like the LHC and CERN.



posted on Jan, 15 2020 @ 06:38 AM

originally posted by: ErosA433
So that's the data, plus the insides of the hardcore statistical analysis, for everyone to look at and do things with should they desire. Now obviously this isn't raw data; for that you'd need grid certificates for access to many petabytes, and needless to say, roaming through all of that data requires thousands of CPU hours... So in many ways, this is the best you're gonna get.
Even CERN can't save all the raw data. According to their website, they generate one petabyte of collision data per second, and they only end up saving about one petabyte per day after filtering/processing. The amount of raw data is mind-boggling.

CERN Data Centre

Particles collide in the Large Hadron Collider (LHC) detectors approximately 1 billion times per second, generating about one petabyte of collision data per second. However, such quantities of data are impossible for current computing systems to record and they are hence filtered by the experiments, keeping only the most “interesting” ones. The filtered LHC data are then aggregated in the CERN Data Centre (DC), where initial data reconstruction is performed, and where a copy is archived to long-term tape storage. Even after the drastic data reduction performed by the experiments, the CERN DC processes on average one petabyte of data per day.
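Those two figures pin down the filtering factor; here's a quick back-of-envelope check (my arithmetic, not CERN's):

```python
# Reduction factor implied by the quote: ~1 PB/s generated, ~1 PB/day kept.
SECONDS_PER_DAY = 24 * 60 * 60               # 86,400 s
generated_pb = 1.0 * SECONDS_PER_DAY         # ~86,400 PB produced per day at 1 PB/s
kept_pb = 1.0                                # ~1 PB archived per day
print(f"reduction: ~{generated_pb / kept_pb:,.0f}x")  # ~86,400x, i.e. ~1 part in 1e5 kept
```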


The article you linked describes an additional filtering process to get the data down from 1 petabyte a day to something more manageable. When I looked into this a while back, to try to help Delbert Larson figure out how to test his theory against CERN data, the publicly available data was down to, I think, about 300 terabytes. I couldn't find a total data size for this latest release you linked to, but if I had to guess, I'd think it might be ballpark 500 terabytes available.

You apparently don't need all 500 terabytes at once; they have it broken down by type of research. This is not exactly what they had when I looked into getting their data before; it's a new and improved version of what they make publicly available, so thanks for posting the link, as I wasn't aware their data offering had been upgraded.

One question that popped into my mind when reading the article, which cited a quote from "Laura Jeanty, ATLAS Supersymmetry working group convenor", is whether their data shows that "supersymmetry is dead", since I have seen various opinions suggesting that may be the case. But Laura apparently hasn't changed her title yet, so how dead can supersymmetry really be?



posted on Jan, 15 2020 @ 02:21 PM
Yeah, that's right, you won't need the whole dataset, and what is presented will be calibrated, reduced data... and when I say reduced, I mean there will be some basic cuts to the data to remove noise, known detector effects, or other issues.

Specific physics also occurs at specific energy ranges, so I imagine some of the datasets will be divided into energy ranges. Others will be events of different kinematic types, such as beam-pipe boosted or zero-z-momentum events, etc.
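To make "basic cuts" concrete, here's a toy sketch of that kind of selection; the variables, distributions, and thresholds are invented for illustration, not the actual ATLAS calibration cuts:

```python
# Toy event selection: keep events with an energetic, central lepton.
# All numbers here are made up to illustrate the idea of a "cut".
import numpy as np

rng = np.random.default_rng(0)
lep_pt = rng.exponential(30.0, size=100_000)   # stand-in lepton pT [GeV]
lep_eta = rng.normal(0.0, 1.5, size=100_000)   # stand-in pseudorapidity

mask = (lep_pt > 25.0) & (np.abs(lep_eta) < 2.5)
print(f"kept {mask.sum()} of {mask.size} events ({100 * mask.mean():.1f}%)")
```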

I think the other set out there was from LHCb... or CMS; this set is from ATLAS, so it's really cool to be able to compare and contrast.
Oh, often these titles stay in place for the remainder of an experiment, even if the group or its activities are minimal. I think the interesting thing about supersymmetry is that it's basically not really ruled out; it's just harder to explain it being there from the data we have, over the energy ranges available. The other issue is that its absence at the energy reach of the LHC makes some of the dependent models somewhat screwy.



posted on Jan, 15 2020 @ 02:37 PM

originally posted by: Arbitrageur
When I looked into this a while back, to try to help Delbert Larson figure out how to test his theory against CERN data, the publicly available data was down to, I think, about 300 terabytes.


OK, you have me interested. Can you elaborate on this “Delbert Larson theory?”



posted on Jan, 15 2020 @ 02:42 PM
a reply to: ErosA433

But but...what about the black holes and satanic rituals?

Anyway...thank you for a really interesting link.


Peace



posted on Jan, 15 2020 @ 04:11 PM

originally posted by: ErosA433
I think the other set out there was from LHCb... or CMS; this set is from ATLAS, so it's really cool to be able to compare and contrast.
Agreed, and the similar results between experiments gave some credibility to the Higgs discovery.


Oh, often these titles stay in place for the remainder of an experiment, even if the group or its activities are minimal. I think the interesting thing about supersymmetry is that it's basically not really ruled out; it's just harder to explain it being there from the data we have, over the energy ranges available. The other issue is that its absence at the energy reach of the LHC makes some of the dependent models somewhat screwy.

Here's an article on CERN's website from supersymmetry researchers saying very similar things. They haven't given up yet, but they have come up empty in every search so far for supersymmetry, and the windows for future searches keep getting smaller.

Broken symmetry: searches for supersymmetry at the LHC

There are still windows where supersymmetry (or some other solution to the hierarchy problem) might appear. These windows are narrowing, but if the experience of the Higgs boson is any guide, the last window is sometimes where things finally show up! In the history of particle physics, there are stories (perhaps apocryphal) of giving up too soon only to be scooped by a later experiment.



originally posted by: M4ngo
OK, you have me interested. Can you elaborate on this “Delbert Larson theory?”
Here is a prediction in Dr. Larson's preon model which can apparently be tested:

The A-B-C Preon Model
"The underlying model predicts a new phenomenon not predicted by the standard model, that is, a 69-GeV "B resonance" should be observable in Hadron and Lepton colliders"

So does the Large Hadron Collider data show a 69-GeV "B resonance" or doesn't it? It's nice to have a clear means of testing a model, so if anybody is interested, they can check the data for this. We know the LHC found a 125-GeV Higgs boson, so 69 GeV should be well within its capabilities.
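For anyone inclined to try, here is a toy sketch of the simplest version of that test, a "bump hunt": histogram an invariant-mass spectrum, estimate the background near 69 GeV from the sidebands, and see whether the window shows an excess. The data below are synthetic (background only), just to show the shape of the method; a real analysis would fit the background shape and handle systematics properly.

```python
# Toy bump hunt around 69 GeV on a synthetic, background-only mass spectrum.
import numpy as np

rng = np.random.default_rng(1)
masses = rng.exponential(40.0, size=200_000) + 20.0  # fake invariant masses [GeV]

edges = np.arange(20.0, 160.0, 2.0)                  # 2-GeV bins
counts, edges = np.histogram(masses, bins=edges)
starts = edges[:-1]

window = (starts >= 65.0) & (starts < 73.0)          # bins around 69 GeV
sideband = ((starts >= 50.0) & (starts < 65.0)) | ((starts >= 73.0) & (starts < 90.0))

expected = counts[sideband].mean() * window.sum()    # crude flat-sideband estimate
observed = counts[window].sum()
z = (observed - expected) / np.sqrt(expected)        # naive Poisson significance
print(f"observed {observed}, expected ~{expected:.0f}, z ~ {z:.2f}")
```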

Preon models in general try to simplify the standard model, which is somewhat complex.



posted on Jan, 16 2020 @ 01:05 AM
This is what cryptocurrency could be used for that actually benefits society, instead of just supporting GPU manufacturers and power companies. Take Bitcoin vs. Gridcoin: Bitcoin does a lot, but it does not do science; Gridcoin does. CERN could easily set up a Gridcoin project that could handle the bandwidth and processing power with enough pooled resources. If you added the kind of financial reward Bitcoin has enjoyed, it would become even more rewarding.

www.reddit.com...



posted on Feb, 5 2020 @ 06:40 AM
Thanks for this thread and for the mentions of the ABC Preon Model. I do hope to get to a study of the open data some day. However, at the moment I am putting all of my free time into my aether model, which is going quite well, even if progress is slow.

Since the closing of Arbitrageur's Ask Anything thread, I don't get here to ATS as often anymore, and hence I am tardy with this flag, star and comment.


originally posted by: M4ngo

OK, you have me interested. Can you elaborate on this “Delbert Larson theory?”


Click on this: The ABC Preon Model to see a thread on the preon theory.


