Swedish e-Science Academy 2015
October 14, 2015 @ 11:00 – October 15, 2015 @ 14:00 | Free
Following the tradition from the three eSSENCE Academy workshops (2012 in Sigtuna, 2013 in Lund, 2014 in Umeå), we have the pleasure of inviting the community to a Swedish e-Science Academy workshop.
eSSENCE has developed state-of-the-art e-Science for five years. Many collaborative efforts between researchers have evolved. Now we will have the opportunity to meet and acquaint ourselves with the exciting research that is happening within Swedish e-Science. The programme will consist of invited talks, poster presentations, and plenty of room for networking.
Note: Attendance is free of charge, but a no-show fee of 1,000 SEK will be applied.
The programme will be updated regularly on this site.
How to get there
* By free shuttle bus: from bus stop 7 outside any terminal at Arlanda Airport, every half-hour >> Timetable
* By bus: bus 571 from Märsta Station to Arlandastad/Eurostop >> Timetable
* By car: from E4, exit 180, Märsta/Eurostop.
If you are arriving by car, register at the conference reception for free parking.
For overnight parking you need a special code for the hotel garage, which you can get at the hotel reception.
Conference venue, hotel and restaurant
Access to the conference venues, the hotel and the restaurant is through the Eurostop shopping mall. Enter the mall from the bus stop or garage. A few meters away you will find the entrance to the hotel and the restaurant. The conference halls are situated at the farthest end of the mall. >> Map
Posters can be put up at registration in the Hall “Frankrike”.
The format of the poster boards is: PORTRAIT, width: 72 cm, height: 180 cm.
SWEDISH e-SCIENCE ACADEMY 2015
Keynote talk: “The Dutch e-Science Initiative” by Frank Seinstra,
Director eScience Program at The Netherlands eScience Center
Venues: Oral presentations – Hall “Europa”. Poster session – Hall “Frankrike”.
WEDNESDAY OCTOBER 14
|10:00||Registration, coffee and sandwiches
Hanging of posters (Hall “Frankrike”)
|11:00||Information for eSSENCE PIs
Ingela Nyström, Director of eSSENCE, UU
|12:30||Swedish e-Science Academy Words of Welcome
Ingela Nyström, Director of eSSENCE, UU
|12:45||Keynote talk: “The Dutch e-Science Initiative”
Frank Seinstra, Director eScience Program at The Netherlands eScience Center
Abstract:
Science and society are inextricably linked. The grand societal challenges of the coming years, such as climate change, resource scarcity, and increased urbanization will require innovative scientific and technological interventions. This revolutionizing of scientific practice will depend for a large part on our capacity to harness the power of ‘big’ compute, data, and analytics technologies.
Modern data-driven and compute-intensive research requires scientists to rapidly develop ICT skills that may currently be foreign to them. For many scientists, the prospect of developing the skills needed to engage with increasingly diverse and complex e-infrastructure is daunting. eScience can provide discipline-focused scientists with the tools, personnel and support needed to lower this barrier and ensure that developments in computing and data science are readily applied in other research areas.
To this end, the Netherlands eScience Center (NLeSC) has been set up as a collaboration between NWO, the principal Dutch scientific funding body, and SURF, the Dutch higher education and research partnership for ICT. NLeSC funds and participates in multidisciplinary projects, with academia and industry, having optimized data handling, efficient computing and big-data analytics at their core. In this presentation, I will discuss NLeSC’s goals, strategy, and approach and highlight some of the results achieved since the center’s launch in 2011.
|13:30||“Parallel Numerical Linear Algebra for Future Extreme-Scale Systems”
Bo Kågström, UmU
Abstract:
We give an overview of NLAFET, our recently approved EU Horizon 2020 project, which is a direct response to the demands for new mathematical and algorithmic approaches for applications on extreme scale systems as identified in the H2020-FETHPC work programme. The aim is to enable a radical improvement in the performance and scalability of a wide range of real-world applications relying on linear algebra software, by developing novel architecture-aware algorithms and software libraries, and the supporting runtime capabilities to achieve scalable performance and resilience on heterogeneous architectures. The focus is on a critical set of fundamental linear algebra operations including direct and iterative solvers for dense and sparse linear systems of equations and eigenvalue problems. The main research objectives are: (i) development of novel algorithms that expose as much parallelism as possible, exploit heterogeneity, avoid communication bottlenecks, respond to escalating fault rates, and help meet emerging power constraints; (ii) exploration of advanced scheduling strategies and runtime systems focusing on the extreme scale and strong scalability in multi/many-core and hybrid environments; (iii) design and evaluation of novel strategies and software support for both offline and online auto-tuning. The validation and dissemination of results will be done by integrating new software solutions into challenging scientific applications in materials science, power systems, study of energy solutions, and data analysis in astrophysics. The deliverables also include a sustainable set of methods and tools for cross-cutting issues such as scheduling, auto-tuning, and algorithm-based fault tolerance packaged into open source library modules.
|13:50||“Modelling materials and chemistry – in an eSSENCE and a European context”
Kersti Hermansson, UU
Abstract:
A Holy Grail of scientific modelling is the ability to design, in silico, materials and molecules with enhanced properties that can ensure a sustained, and sustainable, economic and societal growth. To achieve such impact, we will need to know how to model systems of great complexity (e.g., chemical complexity) with both sufficient accuracy and sufficient speed. Moreover, widely different length and time scales may need to be considered and linked. The development of new, more reliable models and modelling workflows is therefore essential – as realised at the onset of the eSSENCE program. I will discuss efforts and successes in the development of multi-scale materials modelling (with a chemical touch) enabled by eSSENCE, and place them in a broader European perspective.
|14:10||“Data Management and Scientific Computing – upcoming challenges at and around MAX IV”
Tomas Lundqvist, Life Science Director, MAX IV Laboratory, LU
Abstract:
The MAX IV Laboratory is a national laboratory hosted by Lund University that operates accelerators producing X-rays of very high intensity and quality. Although a national laboratory, MAX IV will have many international users from the Nordic countries, the EU and the rest of the world. The new facility will accommodate 26 beam lines when fully developed (Strategy Plan 2013-2026, https://www.maxlab.lu.se/strategy_report). Presently, 13 beam lines are funded and will become operational in 2016 and 2017. Five additional beam lines are currently being planned and funded. It is important to effectively manage and store the data generated by the visiting users while they perform their experiments, so that they can directly assess, reduce and analyze their results. For many of the techniques that will be implemented at MAX IV it is vital to be able to rapidly obtain enough information to take strategic decisions during the experiment. There must also be sufficient storage capacity to allow time to transfer the data to the users’ home institutes. Provision of effective storage and analysis support will greatly improve the potential for quality results. MAX IV should also be able to act as a local hub for access to the necessary expertise.
|14:30||“Efficient Processing for Big Data Streams and their Context in Distributed Cyber-Physical Systems”
Marina Papatriantafilou, Chalmers
Abstract:
Cyber-physical systems are complex and comprise numerous components. They generate large volumes of events, which can be a valuable means for improved, adaptive functionality in these infrastructures. This requires processing big volumes of data generated online, implying a need to carry out computations on the fly, in the streams of data, at different levels of locality in the system. The talk will provide an overview of the research and visions in the context of a set of interdisciplinary projects at the Distributed Computing and Systems group, contributing efficient multicore and data-stream processing that can facilitate the timely extraction of information from different sources of data at different levels of locality in big infrastructures, with examples of actual scenarios coming from adaptive electricity networks, vehicular systems and communication systems. As emphasized in several forums (cf., e.g., the article by Reed and Dongarra in CACM 2015), efficient processing and data analysis need to be unified, and this is what this research is about. Parts of the research are conducted in the projects: VR Fine-grain synchronization and memory consistency in parallel programming, EU Crisalis, EU Excess, SAFER/Chalmers-Transportation Data-driven and distributed algorithms for safe and sustainable vehicular systems, and GE/Chalmers-Escience/Energy EXAMINE.
|14:50||Coffee Break and Check-in|
|15:30||“Autonomous Resource Management for Robust, Efficient, and High-Performance Cloud Computing”
Erik Elmroth, UmU
Abstract:
By taking a holistic approach to cloud resource management, the aim is to transform today’s static and energy-consuming cloud data centers into self-managed, dynamic, and dependable infrastructures, constantly delivering the expected quality of service with acceptable operation costs and carbon footprint for large-scale services with varying capacity demands. The presentation will provide a bird’s-eye view of the challenge as well as glimpses of selected completed and ongoing research efforts. These efforts address fundamental and intertwined self-management challenges, assuming that during execution there are stochastic variations in capacity need and resource availability, as well as changes in system response and operation costs. Sample challenges include how much capacity to allocate at any time for an elastic application, where to allocate that capacity, whether to admit an elastic service with unknown lifetime and future capacity demands, and how to optimize the various management tools’ concerted actions, while taking into account the need for differentiated quality of service and the scalability requirements of the management tools themselves. For further information, see www.cloudresearch.org.
|15:50||Two papers accepted for presentation at IEEE e-Science 2015, Munich|
| “SeSE – Swedish e-Science Education – a Graduate School in e-Science”
Anders Hast, Director of SeSE, UU
Abstract:
Swedish e-Science Education (SeSE) is a national graduate school in e-Science in Sweden. It grew out of the collaboration between two major Swedish e-Science research initiatives, and the school has turned out to be very successful. It has made it possible for students at different universities to get access to education that is not normally available at their home universities. With SeSE they get access to education by top experts within their respective fields. We argue why such a graduate school is important and how it differs from the training offered by many HPC centres in Europe. Furthermore, examples of courses and their structure are discussed, as well as lessons learned from SeSE and its two predecessors in Sweden.
| “e-Science in Cancer Research: Identification of Biomarkers and Signatures in Protein Data”,
Torbjörn Nordling, UU
Abstract:
The correct diagnosis of cancer patients conventionally depends on the pathologist’s experience and ability to distinguish cancer tissue from normal tissue under a microscope. Advances in technology for measuring the abundance of, e.g., proteins and mRNAs in tissue samples make it interesting to search for an optimal subset of these for classification of samples as cancer or normal. This search for an optimal subset of molecules is known in statistics and machine learning as variable selection, feature selection, or subset selection.
It is typically computationally intensive, and biomarker discovery benefits from an e-Science approach. In this talk, I give a brief introduction to biomarker discovery in cancer research. I discuss issues in the identification of biomarkers that provide distinct signatures for prediction of tissues as cancer or normal, exemplified by a recent study of cancer signalling signatures in human colon cancer. More precisely, I discuss ranking of individual features versus combinations of features, model over-fitting, and confidence evaluation. I show that the optimal subset for separation of cancer tissues from normal tissues does not contain any of the proteins in the top quintile in terms of significant difference between the groups according to the Mann-Whitney U-test or correlation to the diagnosis. I also demonstrate how Monte Carlo simulations of the separation with random class assignment can be used to calculate p-values for observing any specific separation by chance, and how the optimal number of proteins in the subset can be selected based on these p-values. Both selection of the optimal number of biomarkers and calculation of p-values corrected for multiple hypothesis testing are essential to obtain a subset of biomarkers that yields robust predictions for clinical use.
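The Monte Carlo procedure described above is, in essence, a permutation test. The sketch below illustrates the general idea under stated assumptions; the function names and the centroid-distance separation score are illustrative choices, not the actual method or data of the study.

```python
import numpy as np

def permutation_pvalue(X, y, score_fn, n_perm=1000, seed=0):
    """Estimate a p-value for an observed class separation by
    recomputing the separation score under random class assignments."""
    rng = np.random.default_rng(seed)
    observed = score_fn(X, y)
    null_scores = np.empty(n_perm)
    for i in range(n_perm):
        # randomly reassign the class labels and re-score
        null_scores[i] = score_fn(X, rng.permutation(y))
    # fraction of random assignments separating at least as well,
    # with a +1 correction so the estimate is never exactly zero
    return (1 + np.sum(null_scores >= observed)) / (1 + n_perm)

def centroid_gap(X, y):
    # toy separation score: Euclidean distance between class centroids
    return np.linalg.norm(X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0))
```

Repeating this for candidate subsets of increasing size gives the per-size p-values on which the choice of the optimal number of biomarkers can be based.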
|Poster Session and Drinks & Snacks|
|19:00||Dinner and Discussions|
THURSDAY OCTOBER 15
|08:30||“Multi-scale Modeling of Spin Dynamics”
Gunilla Kreiss, UU
Abstract:
We are working on combining atomistic and continuum models for dynamic magnetization to obtain a computational technique that combines the efficiency of a continuum model with the accuracy of an atomistic model. Difficulties include very large differences in temporal and spatial scales and the construction of non-reflecting interface conditions.
|08:50||“3D-Data in Cultural Heritage”
Stefan Lindgren, LU
Abstract:
During the last decade, several new technologies and methods have become available to researchers in cultural heritage in general and in archaeology in particular. Today the use of 3D scanning, image-based modelling, ground-penetrating radar, remote sensing and 3D GIS systems is common in any archaeological survey. These methods generate much larger datasets than earlier methods, and it has been, and still is, a challenge to deal with this amount of data. The Lund University Humanities Laboratory, in collaboration with archaeologists around the world, is developing methods for collecting, analyzing and visualizing 3D data in this context.
|09:10||“Modelling and Predicting the Performance of Fusion Plasmas”
Pär Strand, Chalmers
Abstract:
Fusion energy research is currently in an interesting phase: JET, the world’s largest operating tokamak, is preparing a final full fusion-fuel campaign, ITER is under construction in the south of France, and Wendelstein 7-X – a stellarator in Germany – is about to launch its first plasma.
Over the years, modelling and simulations have played an increasingly important role in the design and the planning of operation of fusion experiments. We will here discuss the two approaches to simulation of fusion plasmas. (1) Integrated multiscale modelling, which aims towards a full discharge model for the plasma. (2) First principles modelling of plasma turbulence, which largely describes the plasma confinement.
|09:30||“e-Science for Cancer Prevention and Control – Enabling Translational Medicine with e-Science”
Ola Spjuth, KI
Abstract:
New molecular technologies and computational methodology allow for an improved understanding of the mechanisms of initiation and progression of disease. e-Science for Cancer Prevention and Control (eCPC) is a flagship project and joint collaboration between the Swedish e-Science initiatives SeRC and eSSENCE. The aim is to (a) support cancer biomarker discovery and use, and (b) translate discovery into individualized diagnostics, prevention and screening strategies. A modular framework for cancers of the prostate, breast and cervix has been developed, which includes prediction models for cancer initiation and progression coupled with a microsimulation to evaluate the cost and benefit of various screening scenarios. These e-Science tools have been used, for example, in the landmark STHLM3 diagnostic trial for prostate cancer screening. A system for translating next-generation sequencing into clinical diagnostics has also been developed and is now in use for early detection of mutation frequencies in chronic myelogenous leukemia (CML). The e-Science components identified and used in the eCPC projects include data integration, data security, image analysis, modeling and simulation, and automation of analysis workflows on high-performance e-infrastructures.
|10:20||“e-Science in the Galleries – From Martian Meteorites to Mummies”
Anders Ynnerman, LiU
Abstract:
In the last decades, imaging modalities have advanced beyond recognition, and data of rapidly increasing size and quality can be captured at high speed. This talk will show how e-Science methodology can be used to provide public visitor venues, such as museums, science centers and zoos, with unique interactive learning experiences. By combining large-data visualization techniques with technologies such as interactive multi-touch tables and intuitive user interfaces, visitors can conduct guided browsing of large volumetric image data. The visitors then themselves become the explorers of the normally invisible interior of unique artifacts and subjects. The talk will take its starting point in the current state of the art in CT and MRI scanning technology. It will then discuss the latest high-quality interactive volume rendering and multi-resolution techniques for large-scale data and how they are tailored for use in public spaces. Examples will then be shown of how the inner workings of the human body, exotic animals, natural history subjects such as a Martian meteorite, or even mummies can be explored interactively, bringing e-Science-enabled data exploration to the public. The recent mummy installation at the British Museum will be shown and discussed from both a curator and a visitor perspective, and results from a three-month trial period in the galleries will be presented.
|10:40||“Nordic e-Science Actions”
Sverker Holmgren, Programme Director for NeGI
|11:00||Panel Discussion: “Swedish e-Science in an International Perspective”
Moderator: Kristina Edström, Dean of Research @UU
Confirmed panel members:
|12:15||Swedish e-Science Academy Closing Words
Ingela Nyström, Director of eSSENCE, UU