Two cases currently being discussed:
How could the experience of lab-based scientists be enhanced by making multiple sources of real-time data (e.g. IoT-enabled ambient condition monitoring, the status of running experiments) available to them on demand? We had a demo set up by Matt Krammer showing how researchers in the lab could access a personalized dashboard via a Bluetooth beacon of the kind frequently used in proximity marketing. The participants discussed the idea with reference to the three major components of the case: data sources, data integration, and data consumption.
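The data-integration component might look something like the following minimal sketch: readings from several independent sources are merged into a single payload that a personalized dashboard could render once a beacon has identified the researcher. All source names, fields, and values here are illustrative placeholders, not details from the actual demo.

```python
def read_ambient_sensor():
    # Stand-in for an IoT ambient-condition feed
    # (in practice e.g. an MQTT subscription or REST poll).
    return {"temperature_c": 21.4, "humidity_pct": 45}

def read_experiment_status():
    # Stand-in for a query against an instrument or LIMS API.
    return {"run_id": "EXP-042", "state": "running", "progress_pct": 73}

def build_dashboard_payload(researcher_id):
    """Merge all sources into the payload a personalized
    dashboard would render for the identified researcher."""
    return {
        "researcher": researcher_id,
        "ambient": read_ambient_sensor(),
        "experiments": read_experiment_status(),
    }

payload = build_dashboard_payload("jdoe")
```

The point of the pattern is that each source stays behind its own small function, so new data sources can be added to the dashboard without touching the consumption side.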
How could laboratory infrastructure be improved, with a focus on the instruments in the lab? Today there is a one-to-one ratio between most instruments and the computers through which those instruments are controlled. Could we reduce this ratio by controlling multiple instruments via a generic controller, thereby freeing up space for actual experimentation in the lab? Additionally, as we move towards the vision of the lab of the future (towards which we have taken the first steps by storing data in the cloud), can we also move the data transformation and analysis that currently relies on software installed on lab computers into the cloud environment? Along the way, how do we approach issues such as latency, modularization, security, and scalability?
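One way to picture the generic-controller idea is a registry that maps instrument names to drivers behind a shared interface, so a single controller (or a cloud gateway) can address many instruments instead of each instrument needing its own dedicated PC. This is a hypothetical sketch; the instrument types, commands, and replies are invented for illustration.

```python
from abc import ABC, abstractmethod

class InstrumentDriver(ABC):
    """Common interface every instrument driver must implement."""
    @abstractmethod
    def send(self, command: str) -> str:
        """Send a command and return the instrument's reply."""

class SimulatedBalance(InstrumentDriver):
    # Placeholder for a real driver (e.g. serial/VISA-based).
    def send(self, command: str) -> str:
        return "12.305 g" if command == "READ" else "OK"

class SimulatedPhMeter(InstrumentDriver):
    def send(self, command: str) -> str:
        return "7.02" if command == "READ" else "OK"

class GenericController:
    """One controller addressing many instruments by name,
    replacing the current one-PC-per-instrument setup."""
    def __init__(self):
        self._drivers = {}

    def register(self, name: str, driver: InstrumentDriver):
        self._drivers[name] = driver

    def query(self, name: str, command: str) -> str:
        return self._drivers[name].send(command)

ctrl = GenericController()
ctrl.register("balance-1", SimulatedBalance())
ctrl.register("ph-meter-1", SimulatedPhMeter())
reading = ctrl.query("balance-1", "READ")
```

Because the controller only talks to the abstract interface, the same registry could run on a small on-premises gateway (keeping latency low for time-critical commands) while the transformation and analysis of the returned data happens in the cloud.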