WIS 2.0 Demonstration Projects

Demonstration projects are used to illustrate, evolve, validate, and/or refine the concepts, solutions, and implementation approach of WIS 2.0. They may also demonstrate some of the key benefits that WIS 2.0 will bring to the WMO community.


Discovery Metadata exchange and harvesting

Introduction

Discovery in the WMO Information System (WIS) 1.0 comprises the WMO Core Metadata Profile, together with the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) for harvesting and Search/Retrieval via URL (SRU) for search.

To lower the barrier to the discovery of WIS resources, this project will test and evaluate current standards and mechanisms for metadata, search, and harvesting. Using the principles of linked data, resource-oriented architecture, and Representational State Transfer (REST), the project will attempt to demonstrate ease of use for WIS 2.0 stakeholders as well as integration with mass-market tools (web search engines, etc.).

Project description

This project aims to experiment with implementing WMO discovery metadata using the OGC API - Records draft standard (see https://ogcapi.ogc.org/records/). It will also experiment with actionable linkages to demonstration project 1 (AMQP/MQTT), with search and access of collections of NWP variables, and with enabling search capability against WIS 2.0 topics.
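As an illustration, such a catalogue can be queried with plain HTTP requests. The sketch below is a minimal Python example; the base URL, collection identifier, and query values are hypothetical placeholders rather than an operational WIS 2.0 endpoint.

    import requests

    # Hypothetical OGC API - Records endpoint; base URL and collection id are
    # placeholders, not an operational WIS 2.0 service.
    BASE = "https://catalogue.example.org/oapi"
    COLLECTION = "wis-discovery-metadata"

    # Free-text search ("q" parameter) for records mentioning "temperature".
    resp = requests.get(
        f"{BASE}/collections/{COLLECTION}/items",
        params={"q": "temperature", "limit": 10},
        timeout=30,
    )
    resp.raise_for_status()

    for record in resp.json().get("features", []):
        props = record.get("properties", {})
        print(record.get("id"), "-", props.get("title"))
        # Actionable links (e.g. a download URL or a pub/sub subscription
        # endpoint) are expected in the record's "links" array.
        for link in record.get("links", []):
            print("   ", link.get("rel"), link.get("href"))

Because the records are plain web resources with stable URLs, the same content can also be crawled and indexed by commercial search engines.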


EUMETNET Supplementary Observations Data-Hub (E-SOH)

Introduction

A range of user needs, particularly those associated with short-range forecasting and nowcasting, require access to observations of higher spatial and temporal resolution than are currently being exchanged. NMHS observing systems generate some of the data that are needed, but barriers prevent these from being exchanged. NMHS resource constraints mean that further expansion of the networks to meet high-resolution needs will be very difficult, so instead relationships are being developed with third parties who can offer meteorological observations, such as the hydrological community and operators of low-cost Automatic Weather Stations (AWS). Mechanisms are required to collect, process, and provide access to these data in accordance with the agreed data policies.

Project description

EUMETNET has initiated an activity under its Observations Capability Area to consider the benefits of building a ‘Supplementary Observations Data-Hub’ designed to receive, process, and make available these additional observations in real time. The initial focus will be on enabling the exchange of as much data as possible from NMHS-operated land-surface AWSs and on gaining access to additional rain-gauge data.

This type of data hub is nothing new, with several having been deployed within EUMETNET already. What makes it a potential WIS demonstration project is that it is expected to be deployed on cloud infrastructure(s) and would include functionality to make the data available in a ‘federated’ manner, using web-based services to support machine-to-machine, API-based interfaces. It is anticipated that the system will also generate BUFR messages so that unrestricted data can be exchanged over the GTS where possible.

The project is currently in its initial stages, focusing on better defining the requirements of EUMETNET Members and undertaking a scoping study to identify existing systems and WIS 2.0-compliant data standards that could be used to build the E-SOH. Consideration will be given to both centralized and distributed architectural designs.


Experimental WIS 2.0 data exchange for data in WMO CF-NetCDF profiles 

Introduction

The project aims to experiment with the international real-time exchange of data using publication/subscription (pub/sub) protocols, combined with distributed storage of the original data and a central, mirrored repository for easy access to the complete distributed dataset. Additionally, the data will be discoverable using WIS metadata. The project therefore also assesses the advantages, disadvantages, and usability of WMO Core Metadata Profile 1.3, complemented by the exposure of these data to commercial search engines. The latter is seen as input to the further development of WIS 2.0 metadata and of the topic structure for pub/sub messaging.

In this project, participating centres will make their NetCDF data available over the Internet using established technologies such as HTTPS, FTP, or web services, and will provide Discovery, Access, and Retrieval (DAR) metadata in WMO Core Profile 1.3 to the Global Information System Centre (GISC) Offenbach, either as XML files, through OAI-PMH, or through GISC Offenbach’s MetadataEditor.
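For the OAI-PMH route, harvesting amounts to issuing simple HTTP requests against the provider's OAI endpoint. The sketch below shows how a harvester might page through ListRecords responses; the endpoint URL and metadata prefix are hypothetical placeholders.

    import xml.etree.ElementTree as ET
    import requests

    OAI = "http://www.openarchives.org/OAI/2.0/"
    # Hypothetical provider endpoint and metadata prefix for the profile in use.
    ENDPOINT = "https://data.example.org/oai"
    PREFIX = "iso19139"

    params = {"verb": "ListRecords", "metadataPrefix": PREFIX}
    while True:
        root = ET.fromstring(requests.get(ENDPOINT, params=params, timeout=60).content)
        for header in root.iter(f"{{{OAI}}}header"):
            print(header.findtext(f"{{{OAI}}}identifier"))
        # OAI-PMH paginates with resumption tokens; stop when none is returned.
        token = root.findtext(f"{{{OAI}}}ListRecords/{{{OAI}}}resumptionToken")
        if not token:
            break
        params = {"verb": "ListRecords", "resumptionToken": token}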

GISC Offenbach will use the metadata to access and collect the available NetCDF data, either by centres actively pushing the data to GISC Offenbach or by GISC Offenbach harvesting the data from the centres over the Internet.

Interested users will be able to request access and subscribe to the message brokers to receive notifications when new data become available for download from the central repository.
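A minimal subscriber could look like the sketch below, assuming the paho-mqtt 1.x client, a hypothetical broker host and topic hierarchy, and notification messages carrying a JSON payload with a download URL; the actual message format and topic structure are still to be defined.

    import json
    import pathlib
    import requests
    import paho.mqtt.client as mqtt

    BROKER = "broker.example.org"       # hypothetical broker
    TOPIC = "netcdf/notifications/#"    # hypothetical topic hierarchy


    def on_connect(client, userdata, flags, rc):
        client.subscribe(TOPIC, qos=1)


    def on_message(client, userdata, msg):
        # Assumed payload: {"url": "...", "filename": "..."}
        note = json.loads(msg.payload)
        reply = requests.get(note["url"], timeout=120)
        reply.raise_for_status()
        pathlib.Path(note["filename"]).write_bytes(reply.content)
        print("downloaded", note["filename"], "announced on", msg.topic)


    client = mqtt.Client()              # paho-mqtt 1.x style constructor
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect(BROKER, 1883)
    client.loop_forever()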


Exploring the use of message queuing protocols for GTS data exchange

Introduction

Weather data distribution consists of the timely distribution of both small (a few kilobytes) and large (several gigabytes) datasets, usually following a one-to-many model. Since the 1990s, the protocols defined by WMO for data exchange between National Meteorological Services and other operators have been FTP and, more recently, SFTP. The use of WMO FTP is, however, becoming an issue for data exchange due to:

  • Confidentiality: files are exchanged without encryption
  • Integrity: it is not possible to verify the data source, as messages/files are not signed
  • Availability: the protocol itself does not provide recovery mechanisms in case of telecommunication link failures
  • Performance: when delivering many files, especially over high-latency networks, FTP is not efficient and does not make use of all available bandwidth

WMO FTP is used for GTS data exchange and, in order to provide all GTS/WIS features, NMHSs have implemented Automated Message Switching Systems (AMSS) that satisfy the WIS-GTS message-routing requirements with FTP as the exchange protocol. The emergence of new open-standard message queuing protocols provides an opportunity to improve and simplify these systems by eliminating the need for routing procedures.

Project description

The project aims to experiment with the international exchange of GTS data using publication/subscription (pub/sub) protocols such as the Advanced Message Queuing Protocol (AMQP) and Message Queuing Telemetry Transport (MQTT). Several Global Information System Centres (GISCs), Data Collection or Production Centres (DCPCs), and National Centres (NCs) will be involved in the project in order to leverage existing national experience with pub/sub solutions in the context of international collaboration.
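As a sketch of the publication side, a centre could announce each new product on a topic exchange; subscribers then receive only the products whose routing keys match their bindings, removing the need for explicit routing tables. The example below uses the pika AMQP client; the broker host, exchange name, routing-key convention, and notification payload are illustrative assumptions, not project specifications.

    import json
    import pika

    # Hypothetical broker, exchange, and routing-key convention.
    conn = pika.BlockingConnection(pika.ConnectionParameters(host="broker.example.org"))
    channel = conn.channel()
    channel.exchange_declare(exchange="gts", exchange_type="topic", durable=True)

    notification = {
        "url": "https://data.example.org/bulletins/ISMD01_EDZW_251200.txt",  # placeholder
        "size": 2048,
    }
    channel.basic_publish(
        exchange="gts",
        routing_key="surface.synop.edzw",   # subscribers bind e.g. "surface.synop.#"
        body=json.dumps(notification),
        properties=pika.BasicProperties(delivery_mode=2),  # persistent message
    )
    conn.close()

An equivalent MQTT publication would use a topic string in place of the AMQP routing key.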


GISC Beijing Web services catalogue projects

Introduction

The WIS 2.0 Functional Architecture includes the requirements to:

  • “Maintain and expose Catalogue of services and information”, containing metadata that describes both data and the services provided to access that data, via APIs, file download, etc.
  • “Interoperate with other information systems”, particularly with the World Wide Web, ensuring that Web services can be indexed and discovered by commercial search engines.

Project description

The project aims to design metadata for Web services and APIs and to implement a Catalogue of services as a portal website. Service providers can publish their services as service metadata records describing APIs, data, and how to access them. Each service metadata record is published to the Web with an accessible URL. Service users can discover services of interest either via the Catalogue portal or via commercial search engines. As a pilot project, several services covering members of GISC Beijing's area of responsibility (AoR) will be implemented and published.
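To make this concrete, a service metadata record could be a small, web-accessible JSON document. The sketch below builds one as a Python dictionary, loosely following the record model of the OGC API - Records draft; all identifiers, URLs, and field choices are illustrative and not a GISC Beijing specification.

    import json

    # Illustrative service metadata record, loosely modelled on the OGC API -
    # Records draft record structure; none of these values are normative.
    service_record = {
        "id": "radar-mosaic-api-example",
        "type": "Feature",
        "geometry": None,                  # or a polygon describing the service coverage
        "properties": {
            "type": "service",
            "title": "Radar mosaic access API (example)",
            "description": "Hypothetical API providing composite radar imagery.",
            "keywords": ["radar", "API", "WIS 2.0"],
        },
        "links": [
            {
                "rel": "service",
                "type": "application/json",
                "title": "API landing page",
                "href": "https://api.example.org/radar",   # placeholder URL
            }
        ],
    }

    # Publishing the record means exposing this document at a stable URL that
    # both the Catalogue portal and search-engine crawlers can index.
    print(json.dumps(service_record, indent=2))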


GISC Tokyo cloud project

Project description

The purpose of this demonstration project is to provide data exchange functions and visualization tools on internet cloud services as a prototype for GISC Tokyo's area of responsibility (AoR), in accordance with the WIS 2.0 principles. Regarding data exchange functions on the cloud service, the use of traditional bulletin and file styles is envisaged, and the exchange of small files such as SYNOP and TEMP reports will also be considered. For large-volume files such as satellite and NWP products, the cloud service plans to offer processing services such as extraction of a segmented area image, an NWP viewer, etc. These services, based on Web architectures and cloud computing, will develop an understanding of WIS 2.0 and contribute to a smooth migration from the current WIS framework to WIS 2.0 for users in GISC Tokyo's AoR.


Global Cryosphere Watch

Introduction

The World Meteorological Organization's Global Cryosphere Watch (GCW) is a mechanism for supporting all key cryospheric in-situ and remote sensing observations, and it facilitates the provision of authoritative data, information, and analyses on the state of the cryosphere.

To achieve this, real-time data and long time series of data and products will have to be made available to all consumers. Data and products are produced by NMHSs and by other operational and scientific communities. The latter two often have limited resources and rely on a variety of data management approaches that are quite different from those of the WMO community. GCW is establishing a link between these communities through WIS and WIGOS. In order to successfully implement GCW, barriers between communities need to be lowered.

Project description

GCW data management follows a metadata-driven, service-oriented approach. It is based on the FAIR guiding principles and aligns well with the WIS principles.

Datasets are documented by standardized discovery metadata that are exchanged through standardized Web services. The GCW Data Portal can interface with scientific and other data providers, as well as with WMO-specific interfaces such as real-time exchange through the GTS. For all other purposes, the Internet is used as the communication network. A critical component of the discovery metadata exchanged is the application of standardized semantic annotation of data and interfaces, for example using ontologies, as well as linkages between datasets and additional information needed to fully understand a dataset (e.g. WIGOS information).


Interconnection of GISC Casablanca to the National Meteorological Centres within its area of responsibility

Introduction

Telecommunications have always been one of the main technical constraints limiting the global exchange of meteorological and climatic information. The implementation of WIS has been compromised mainly by the inability of NMHSs, particularly those in least developed countries (LDCs) and small island developing states (SIDS), to liaise with their principal GISC. The design of WIS 2.0 attempts to remedy this situation by encouraging the use of new telecommunications technologies and by adopting a web-based architecture.

Project description

Two Global Information System Centres (GISC) have been designated for the Regional Association I (RA I): GISC Casablanca and GISC Pretoria. The African continent has the distinction of covering a vast geographical area and a large number of Members with varied economic potential and socio-cultural specificities.

The area of responsibility (AoR) of GISC Casablanca, for example, includes more than 37 WIS National Centres (NCs) and more than 8 WIS Data Collection or Production Centres (DCPC). For several years, it has been difficult to set up direct links between these national centres and their principal GISC, despite the efforts made in terms of awareness-raising and capacity-building.

The project aims to promote the use of the Internet as a support for the exchange of data between GISC Casablanca and the NCs and DCPCs within its AoR, given the difficulties encountered while trying to implement peer-to-peer links or point-to-point internet VPN.

Once the pilot project is completed, NCs and DCPCs in RA I will have open access to GTS data using secure protocols for the data transfer.


Open Access to the GTS (Open-GTS)

Introduction

Currently, oceanographic and marine meteorological data is distributed globally to forecast centres and other operational data users. This is done using the WMO Global Telecommunication System (GTS). The availability of real-time data is crucial for forecasters, emergency managers, and other scientific purposes. The format of choice for distribution on the GTS is BUFR, which is a table-driven binary format that has very rarely been utilized in research oceanography. This format requirement becomes an imposing barrier for distributing and accessing data via the GTS.

The goal of the Open Access to the GTS (Open-GTS) project is to develop and implement improved methodologies for the distribution and access of near-real-time ocean and marine meteorological data through the GTS. As the Open-GTS project workflow embraces several of the WIS 2.0 principles, it seems highly compatible as a WIS 2.0 demonstration project.

The project has been developed and supported by the GOOS Observations Coordination Group (OCG).

Project description

As mentioned, there are two facets to the Open-GTS Project: data distribution and data access. For data distribution, the Open-GTS project leverages an open-source tool called ERDDAP to connect data producers with National Data Centres (NDCs). The benefit of this connection is that data producers can work in the data formats they are most used to, with the main requirement being that the data are served by ERDDAP and contain sufficient metadata. The NDCs will then harvest the data in whatever format they prefer, encode the data into BUFR using standard templates, and distribute those messages via the GTS.

For accessing data from the GTS, the workflow is reversed. NDCs will harvest the data from the GTS using their connection to the WIS, decode the BUFR messages, save the data in a consumable format, and load the data into ERDDAP. It is also possible to push the data into ERDDAP directly as part of the decoding process, thereby avoiding the need to create intermediate files. Once in ERDDAP, data consumers are free to access and use the data in a variety of formats, ensuring that they can work with the formats and clients they are most familiar with, without being burdened by the cumbersome process of reformatting BUFR data. It is worth noting that the ERDDAP data platform also supports federation, thereby providing easy access to distributed datasets as well. This capability ensures that data consumers who do not have a direct connection to the WIS are still able to access and use the near-real-time data.
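For illustration, the sketch below retrieves a subset of a dataset from an ERDDAP tabledap endpoint as CSV and loads it with pandas; the server URL, dataset identifier, and variable names are placeholders rather than references to an actual NDC deployment.

    import urllib.parse
    import pandas as pd

    # Placeholder ERDDAP server and dataset identifier.
    SERVER = "https://erddap.example.org/erddap"
    DATASET = "drifting_buoys_nrt"

    # ERDDAP tabledap query: requested variables, then "&"-separated constraints.
    variables = "time,latitude,longitude,sea_surface_temperature"
    constraint = urllib.parse.quote("time>=2021-03-01T00:00:00Z")
    url = f"{SERVER}/tabledap/{DATASET}.csv?{variables}&{constraint}"

    # ERDDAP's .csv output includes a row of units after the header, hence skiprows=[1].
    df = pd.read_csv(url, skiprows=[1])
    print(df.head())

The same dataset could equally be requested as JSON, NetCDF, or other formats by changing the file-type suffix in the request URL.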


WMO Hydrology Observing System

Introduction

In hydrology, the numerous and varied activities and applications have led to widespread heterogeneity in resources and procedures, which has hindered cooperation among the different actors and stakeholders. The goal of WHOS is to fully implement the following concepts:

  • Water Data Catalogue: federating heterogeneous data providers and publishing a number of interfaces in support of advanced data discovery and access
  • Water Data as a Service: a broker mediating data formats and services in support of interoperability between data providers and users
  • Enriching Water Data: adding appropriate hydrological attributes that make data discoverable for a wide range of operational and scientific purposes
  • Community for Water Data and Tools: removing the digital divide in hydrology with the aim of building a community interacting with the offered architectural functionalities

Project description

WHOS can be defined as a collection of components that work together to store, index, access, and distribute hydrological information. WHOS is built around seven fundamental components: (1) data, (2) format, (3) service, (4) mediator, (5) broker, (6) ontology, and (7) client. WHOS defines a new paradigm in hydrology, a “datagram”, reshaping data exchange using all seven of these components.

The WHOS components enable the linkage of service providers with service consumers. In particular, the catalogue services facilitate the discovery of data sources, while the mediator and broker components facilitate the connection to data services for data request and access. When data providers are harvested by WHOS, their data services are included in the registry. When data users search for available services of interest, WHOS improves data discovery and access with advanced interoperability, where every component serves an important role in the designed architecture. Accordingly, data providers and users play the primary role in the exchange of information and in the development of a distributed knowledge base holding large volumes of hydrological data in a federated architecture for e-monitoring.

 

~ page last updated: 1 April 2021 ~