To enable an iCal export link, your account needs to have an API key created. This key allows other applications to access data from Indico via the link provided, even when you are not using or logged into the Indico system yourself. Once created, you can manage your key at any time by going to 'My Profile' and looking under the tab entitled 'HTTP API'. Further information about HTTP API keys can be found in the Indico documentation.
In addition to having an API key associated with your account, exporting private event information requires the use of a persistent signature. This enables API URLs which do not expire after a few minutes, so while the setting is active, anyone in possession of the link provided can access the information. For this reason, it is extremely important that you keep these links private and for your use only. If you think someone else may have acquired access to a link using this key, you must immediately create a new key pair on the 'My Profile' page under the 'HTTP API' tab and update the iCalendar links afterwards.
Permanent link for public information only:
Permanent link for all public and protected information:
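For reference, the signing scheme behind such links can be sketched as follows: the API key is sent as a query parameter, and a signature is computed over the request path and sorted query string using the secret key. This is a minimal sketch based on the Indico HTTP API documentation; the key pair and event path below are hypothetical placeholders, and the exact parameter set may differ between Indico versions.

```python
import hashlib
import hmac
from urllib.parse import urlencode

def build_signed_url(path, params, api_key, secret_key, timestamp=0):
    """Build a signed Indico HTTP API URL.

    With a persistent signature the timestamp is omitted (0), which
    produces a link that does not expire -- hence the need to keep
    such links private.
    """
    items = dict(params)
    items["apikey"] = api_key
    if timestamp:
        items["timestamp"] = str(timestamp)
    # The signature covers the path plus the query string,
    # with all parameters in sorted order.
    url = f"{path}?{urlencode(sorted(items.items()))}"
    signature = hmac.new(secret_key.encode(), url.encode(),
                         hashlib.sha1).hexdigest()
    return f"{url}&signature={signature}"

# Hypothetical key pair and event export path, for illustration only
print(build_signed_url(
    "/export/event/12345.ics",
    {"detail": "events"},
    api_key="00000000-0000-0000-0000-000000000000",
    secret_key="11111111-1111-1111-1111-111111111111",
))
```

Without the timestamp, the same key pair always yields the same URL, which is what makes the link "permanent".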
SIB Swiss Institute of Bioinformatics
Quartier Sorge, Genopode building
(Vital-IT), Michele De Lorenzi
Over the last 10 years, the emergence and steady improvement of high-throughput technologies in the life science and biomedical fields have generated massive amounts of data, providing an unprecedented data-based description of a human individual. The data explosion is expected to continue, with 2 to 40 exabytes of genetics and genomics data expected to be produced over the next 10 years. Managing the life cycle of such rich personalised data sets, which comprises data handling, processing and preservation, remains challenging.
Moreover, fostering interdisciplinary collaborations across mutually "untrusted" third parties (e.g., HPC computing centres, hospitals and industry) while preserving individual data privacy is a topic of friction and intense (computational) development.
In this forum, we would like to discuss the current state of the art in managing biomedical and other sensitive data, the inherent costs of compliance and standardization, and how to preserve data privacy when expert-intensive computational analyses are required within an HPC computing environment.
How can public leaks of sensitive data be avoided, protecting at the same time your reputation and that of your institution?
What are the best practices in sensitive data management?
What are the best practices in preserving patient data?
What are the HPC tools and technologies that foster multi-disciplinary collaborations across several partners while preserving patient data security?
How can we efficiently manage the cost of compliance and standardization?
How can we make sense of very large biomedical data by harmonizing disparate data sets from heterogeneous sources, while respecting international and institutional regulations on patient data security, for the purposes of Big Data analytics?
Derek Heinrich Feichtinger
Diana Coman Schmid
Eur Ing Fotis Georgatos
Guy Mael Horclois
Hon Wai Wan
Jean Louis Raisaro
Michele De Lorenzi
Coffee and registration
Welcome and introduction
(Vital-IT), Michele De Lorenzi
Keynote Presentation. Data Protection for Personalized Health
In this talk, we will describe the challenges of data protection in personalized health and discuss possible solutions. We will also explain how this issue will be addressed in the framework of the Swiss Personalized Health Network.
Keynote Presentation. Data & Computing Services for Personalized Health: a Paradigm Shift
The talk will introduce the Swiss Personalized Health Ecosystem (SPHN/PHRT projects and beyond) and give a user and usage perspective on handling sensitive biomedical data in secure computing environments. The special context is set by patient data, which comes with high legal and ethical requirements as well as challenging computing demands.
To offer top-class services for Personalized Health research, secure and powerful IT infrastructures for data storage, computing and sharing are a must. This is instrumental but not sufficient. What we also need in this diverse and dynamic ecosystem are innovative teams with hybrid expertise (for example, medicine, bioinformatics and IT). With the user experience as a central focus, the challenge lies in finding the right balance between security and usability.
Christian Bolliger will further extend on how Leonhard Med - a secure computing and data platform for Personalized Health - is addressing the requirements and challenges for Personalized Health data handling.
Diana Coman Schmid
Building a Secure HPC System - Technical and Regulatory Aspects
Leonhard Med is an HPC platform designed for the needs of medical research, or more specifically for data analytics and AI in medical research. This poses quite a few challenges and constraints.
The system must fulfil the legal and ethical requirements while giving researchers the tools they need and are accustomed to. It must be possible to exchange data with similar sites and hospitals while preserving the privacy of patient data.
Last but not least, the platform must be adaptable to future requirements while allowing efficient system administration.
Lunch and networking
This presentation will focus on security awareness in the management of sensitive data in HPC environments.
(ETH Zurich / CSCS)
Keynote Presentation. A Confederation Approach to HPC in Switzerland: the Example of Vital-IT, Over the Last 15 Years Dedicated to Life Science
This presentation will highlight the challenges of maintaining a competence centre aligned with the needs of a very diverse and evolving environment.
Examples ranging from biomedical to basic science will serve as a thread through the presentation, shedding light on the need for coordination and means of exchange amongst competence centres in Switzerland and beyond.
Federating Clinical Data for Biomedical Research
The protection of clinical data is of paramount importance but this can hinder its use in research due to ethical and legal constraints.
This presentation will cover the construction of a federated database containing detailed clinical information that is amenable to remote analysis through a central server. The particularity of this system is that it allows complex analyses to be performed without any individual-level data being moved or copied from their original location.
The technical challenges, lessons learned, limitations and opportunities gained by such an approach will be discussed.
Streaming Genomes with MPEG-G: the New ISO Standard for Genomics Data?
This presentation introduces the essential features of MPEG-G, the emerging ISO standard for genome sequencing data compression, storage and transport. MPEG-G is an ISO standardization initiative addressing the problems and limitations currently faced by applications relying on commonly used legacy formats.
The standard is currently in its final development stage and is planned for publication in early 2019. The objective is to provide compression and transport technologies yielding efficient, versatile and economical handling of sequencing data and associated information. Besides efficient compression and native transport capabilities, notable features include data streaming, selective access to compressed data, data aggregation, file annotation, conversion to legacy formats, enforcement of privacy rules, and selective encryption of sequencing data and metadata. These features make MPEG-G the technology base for supporting the interoperability of many complex use cases and associated implementations.
Panel Discussion: Security Considerations in Cloud and Using Bare Metal HPC Technologies
Cloud and bare metal HPC technologies for computing, storage and networking are rapidly evolving and converging.
In this panel, a group of experts and technical leaders will discuss key challenges and opportunities for the community of HPC service providers, i.e. hpc-ch. In particular, the discussion will focus on the diverse nature of data sources and their management and control within a data centre environment.
- Christian Bolliger (ETH Zurich)
- Guy-Maël Horclois (CSCS)
- Daniele Passerone (Empa)
- Heinz Stockinger (SIB Swiss Institute of Bioinformatics)