1 Introduction

In the medical domain, novel intelligent computing services usually leverage wearable and IoT devices to address various health and well-being issues [2, 38]. Nowadays, there is a wide diversity of available devices, sensors, and computer-based solutions that fall under the category of IoT. However, defining IoT is not straightforward, and multiple definitions can be found in the specialised literature [20, 62, 67]. These definitions attempt to delimit IoT based on the characteristics of its elements [67], communication typology [62], enabling aspects for applications [66] or even from a temporal perspective that considers the ratio of connected objects to the internet versus people [24]. In our proposal, we focus on the use of wearable devices, which are considered a subset of IoT devices according to Rose et al. [67].

The IoT has evolved into a robust and in-demand service infrastructure for the healthcare sector, known as the Internet of Medical Things (IoMT) [65]. Similarly, the term mHealth is used to describe the utilisation of mobile devices and other wireless devices in healthcare [13, 54]. Wearable devices enable pervasive remote monitoring of patients’ parameters such as heart rate, body temperature, cardiovascular pressure or glucose levels, among others. Therefore, wearables are highly suitable for implementing IoT-based solutions in the health and well-being domains [38]. Thus, the gathered data streams related to these parameters can be used to recognise complex activities such as physical exercise, sleep cycles, anxiety episodes, cardiac crises or loss of consciousness [18].

Despite the relevant features and advantages provided by wearable devices for health-based solutions, it should be noted that IoT devices in general, and wearables in particular, suffer from security problems [81]. These issues have led to the failure of many related real-world proposals [49, 75] and impede the current and future development of promising health services based on these devices. In fact, complex analysis of wearable data streams can lead to the discovery of critical personal data [25, 41].

Consequently, in order to address these security problems, recent proposals explore the use of Distributed Ledger Technologies (DLTs) to leverage their features [26, 80], specifically their promise of resistance to information tampering and their resilience to single points of failure, which alleviate existing security concerns. Addressing these security challenges is crucial to enhance the comprehensive context of patients’ clinical information and other data elements included in precision medicine [37]. Thus, the combination of IoT and DLTs represents a significant advancement in medical applications such as drug traceability, patient monitoring and the management of electronic health records [35, 64].

Following this trend, this paper aims to study the applicability of DLTs to a real-world wearable scenario, with a primary focus on securing the processes of collecting, storing and publishing health wearable data. Consequently, an analysis of shortcomings, limitations, potential applications and needs in the medical domain is conducted based on the specialised literature, in order to introduce a novel proposal for secure IoMT based on DLT and to develop a platform called Phonendo. Phonendo enables pairing wearable devices, capturing and verifying their data streams, and publishing them on a dedicated DLT infrastructure, commonly referred to as a DLTI, thus avoiding previous flaws.

The remainder of this contribution is structured as follows. In Sect. 2, related works are analysed to extract the key features for developing a DLT-based framework. Section 3 includes an analysis of the impact of publishing data on a DLTI, its adaptation to a wearable scenario and the potential benefits and limitations of using DLTs in the medical domain, introducing some possible applications. Section 4 presents the architecture of Phonendo and justifies its design decisions. The paper finishes with the conclusions and future work in Sect. 5. The Appendix reviews the main concepts of IoT and DLT necessary for a clear understanding of the proposal.

2 Related works

The need for IoT solutions to benefit from certain DLT characteristics has resulted in the integration of DLT into these solutions [7]. The basic concepts of IoT and DLT are provided in the Appendix for reference.

Since Samaniego et al. proposed the idea of Blockchain as a Service (BaaS) for IoT [69], dozens of researchers have applied DLT to their work [7, 57, 77]. Multiple examples can be found in the literature across various contexts, for instance in industrial IoT solutions [11, 27], such as those focusing on supply chains [85] or food traceability [47]. Athavale et al. integrate blockchain with IoT for storing and managing data [9], and similarly, Ozyilmaz et al. take advantage of smart contracts to develop a marketplace for data obtained by IoT devices [56].

Apart from the various applications in different scenarios, most works focus on using DLT, generally blockchain, as a security mechanism for their systems [29, 60]. It is notably used for forensics [45, 48] and access control [36, 58].

In the medical domain, more and more medical practitioners are nowadays encouraging patients to use wearable devices to collect medical records outside the hospital environment. Patients are willing to do so, as they want to be involved in their diagnosis and make more informed decisions about their health [6]. However, the increasing number of wearable medical devices has raised concerns about the possibility of these devices being hacked by unauthorised individuals to access patients’ health records [4]. These are the main issues that have motivated the adoption of DLT in the medical domain.

It is necessary to highlight that most of the proposed solutions are still prototypes and not yet in use. However, as DLTs mature, they will be increasingly adopted in real scenarios. Gupta et al. proposed a blockchain-based framework for telesurgery using Hyperledger Fabric (HF) [33]. Alam et al. proposed a framework that integrates blockchain and IoT to enable data sharing for remote patient monitoring, aiming to provide accurate diagnosis while reducing costs and unnecessary hospitalisations [6]. In this case, blockchain is mainly used to foster transparency and information exchange between parties. Similarly, Amofa et al. introduce a prototype that takes advantage of Smart Contracts (SCs) to securely share personal health data [8].

More recently, Namasudra et al. proposed the use of Ethereum and SC for generating and verifying medical certificates [51], while Khan et al. store and update information obtained from a brain-computer interface in HF using SCs [42].

Regarding the storage of electronic health records, Shahnaz et al. use Ethereum to provide secure storage for electronic health records by implementing multiple access rules via SCs [73].

To sum up, the permissionless solutions that use Ethereum offer transparency and can benefit from SCs, but have associated transaction fees. Moreover, these solutions do not scale well, and transactions are confirmed with a delay of seconds.

On the other hand, permissioned approaches, mostly based on HF, scale well and confirm transactions quickly. However, the existence of a central authority that grants access to participants can be seen as a lack of transparency.

To address these issues, researchers have turned to IOTA (see “Appendix”), leveraging its unique features as a permissionless DLTI that offers high scalability and fee-less transactions. Several studies have explored the use of IOTA in the healthcare domain. For instance, Cisneros et al. proposed CoviReader, a decentralised healthcare management system that anonymously shares user data to assist in controlling the spread of Covid-19 [21]. Abdullah et al. utilised IOTA MAM channels to ensure secure data sharing within a healthcare system [3]. Rydningen et al. highlighted the advantages and opportunities of using IOTA for health data management, while also discussing concerns such as privacy, security and data inaccuracies [68].

Given the aforementioned considerations and recognising the benefits of IOTA, this paper introduces Phonendo, a platform that leverages IOTA as its chosen DLTI (see Sect. 4).

3 Examining the feasibility of DLT in mHealth scenarios

In this section, we analyse the impact of publishing data on a DLTI, its adaptation to a wearable scenario, and the potential applications and limitations.

3.1 Analysis of the impact of a DLTI on the data register

Before designing or implementing any solution that integrates a DLTI, it is essential to conduct an analysis. This analysis becomes even more important in IoT scenarios because the suitability of a DLTI depends on the characteristics of each environment (e.g. number of devices, type of data, data frequency...), making it necessary to justify the convenience of using a DLTI. In our analysis, we consider the following aspects: decentralisation, confidentiality, performance, and transparency.

Regarding decentralisation, we should evaluate whether our system benefits from the absence of a central authority controlling and validating transactions. In mHealth scenarios, decentralisation may be beneficial, as it allows patient data to be validated and stored in a more secure and transparent manner, without the need for a central authority. Therefore, these systems might benefit from being decentralised.

Concerning confidentiality, we should consider whether replicating data in multiple nodes violates any restrictions. It is important to emphasise that all data stored in a DLTI are immutable, so no sensitive data should be stored in it. Therefore, DLTIs are only suitable in mHealth if the solution stores anonymous data only.

In terms of performance, a traditional database outperforms a DLTI in managing large volumes of data and numerous writing operations. Additionally, it is necessary to determine the maximum delay that our processes can tolerate until a piece of data is stored. For example, in emergency response systems, data must be stored immediately, while in a smart home, it might be acceptable if data are stored within a few seconds. Considering that most IoMT solutions involve sending recurrent but small data messages, DLTIs are recommended in mHealth solutions as long as the selected DLTI has high scalability and the solutions can accommodate its transaction time.

Regarding transparency, DLTIs cannot remove or modify previous transactions, preserving the dependencies between them, which is crucial for traceability of the records. This allows third parties to audit the stored data, providing transparency to the process. For example, mHealth can benefit from this traceability to recognise health events from patients’ vital signs.

In summary, after considering the above aspects, we conclude that DLTIs are suitable for use in mHealth scenarios. Therefore, any solution that uses data stored on a DLTI can benefit from: (i) resilience to single points of failure, (ii) resistance to alteration and/or deletion of data, (iii) protection against timestamp modification, and (iv) resistance to identity spoofing [19].

Feature (i) is provided by the use of a distributed system rather than a centralised one. Hence, compromising the entire system would require attacking multiple nodes. Since there is no central authority verifying the network status in a distributed system, consensus mechanisms are integrated to carry out the verification process. These mechanisms make DLTIs tamper-proof, preventing users from rewriting the ledger, which provides features (ii) and (iii) to the stored data. Finally, feature (iv) is provided by the use of asymmetric cryptographic mechanisms, such as digital signatures.

From another perspective, it is also important to analyse the impact of DLTIs that support SCs, allowing the development of decentralised applications (dApps), because they enable: (i) the implementation of crypto-economic models as a way to incentivise users to share data (e.g. geolocation); (ii) the allocation of non-fungible tokens to prove participation in the system; (iii) marketplaces for buying and selling aggregated or individual data; or (iv) Decentralised Autonomous Organisations (DAOs) [44] to regulate aspects such as adding permissions or issuing rewards to participants.

At this point, we have only focused on storing data on a DLTI ledger. However, in some cases, using a Distributed File System (DFS) such as the InterPlanetary File System (IPFS) could be sufficient. Rather than storing a file in a single, centralised location, it is disseminated across a distributed system of users, each holding a portion of the overall data. Nonetheless, it is necessary to highlight that a DFS might either: (i) lose some of the benefits offered by a DLTI or (ii) require additional actions to achieve equivalent characteristics.

For example, with a DFS it is easier for malicious participants to return different values for the same read or to conceal information so that a particular value is never delivered. Apart from that, solutions such as IPFS do not guarantee the retrieval of a specific value, which requires the existence of nodes holding a copy of that particular value. Thus, one’s own nodes or dedicated pinning services must be used to prevent data loss.
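To illustrate this last point, the following minimal sketch adds a record to IPFS and pins it on one's own node so that it is not garbage-collected and remains retrievable. It is only an illustration under stated assumptions: a locally running IPFS daemon, the ipfs-http-client package, and made-up record fields; none of this is part of Phonendo.

```javascript
// Minimal sketch: store a record on IPFS and pin it on our own node so that
// at least one controlled node keeps a copy (assumes a local IPFS daemon
// listening on the default API port and the ipfs-http-client package).
import { create } from 'ipfs-http-client';

const ipfs = create({ url: 'http://127.0.0.1:5001/api/v0' });

async function storeAndPin(record) {
  // Add the JSON document; the returned CID identifies it content-addressably.
  const { cid } = await ipfs.add(JSON.stringify(record));
  // Pin it so the local node does not garbage-collect the data.
  await ipfs.pin.add(cid);
  return cid.toString();
}

// Illustrative payload only: real deployments should store anonymised data.
storeAndPin({ device: 'wearable-01', hr: 72, ts: Date.now() })
  .then((cid) => console.log('pinned as', cid));
```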

3.2 Adapting to a mHealth wearable scenario

Focusing on a mHealth scenario with a prominent use of wearables, it is necessary to determine how to adapt the architecture to such devices and which DLTI is the most appropriate for storing their data.

Adapting the architecture to wearable devices 

A platform that exposes an architecture adapted to wearables must take into consideration their characteristics. The main ones are: (i) concurrency of multiple devices, (ii) small data payloads, (iii) recurrent data capture, (iv) sensitive data, (v) easy attackability, (vi) low computational resources, (vii) different hardware specifications, and (viii) different communication protocols.

Characteristics (i), (ii), (iii) and (iv) have been previously considered in Sect. 3.1 regarding decentralisation, performance and confidentiality. Thus, the architecture should be highly scalable in order to process multiple wearable devices continuously sending small but recurrent messages. Moreover, only anonymous data should be published on a DLTI. Regarding (v), security mechanisms need to be implemented to maintain system integrity and data privacy even if a given wearable is hacked. Concerning (vi), these mechanisms cannot impose extra computational costs on the wearables. Finally, considering characteristics (vii) and (viii), the architecture should address the interoperability issues caused by heterogeneous hardware and communication protocols.

Selection of the most appropriate DLTI

In our scenario, the main consideration when choosing a DLTI may focus on whether to select a permissioned or a permissionless DLTI. On the one hand, the former optimises computational resources, reduces operating costs and increases control over access to information. On the other hand, the latter allows any entity to verify the information within the ledger and build third-party applications using these data.

Therefore, we consider that a permissionless DLTI would allow a larger number of applications to be conceptualised and thus, our proposal focuses on this type of DLTIs.

Table 1 Comparison between the DLTs

Considering the different DLTI options described in Table 1, IOTA [59] has been chosen as the DLTI for our platform. It is a DAG-based DLTI with the following features [28]:

  • No commission costs on writing IOTA is feeless, so users do not have to pay to publish on the Tangle. Therefore, no tokens are required for the publication process.

  • Low latency information addition In IOTA, there are no blocks. Any new transaction needs to validate two previous transactions in order to be appended to the ledger, which gives the infrastructure high throughput.

  • Libraries The IOTA ecosystem provides multiple libraries that facilitate the integration of applications with the DLTI (a brief publication sketch using such a library is given after this list).

  • Support for SC The latest version of IOTA integrates the IOTA Smart Contract Protocol (ISCP), allowing developers to use SCs in the DLTI.
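As a brief illustration of the feeless, indexed publication model listed above, the sketch below submits a small anonymous payload to the Tangle as a pure data (indexation) message. It assumes the Chrysalis-era @iota/client Node.js bindings and a public devnet node; the node URL, index tag and payload are illustrative, and the exact builder methods may differ between library versions.

```javascript
// Hedged sketch: feeless publication of an anonymous, indexed payload on the
// IOTA Tangle. The library, node URL and index tag are assumptions, not part
// of Phonendo.
import { ClientBuilder } from '@iota/client';

const client = new ClientBuilder()
  .node('https://api.lb-0.h.chrysalis-devnet.iota.cafe') // public devnet node (assumption)
  .localPow(true) // compute PoW locally; a constrained device could delegate it to the node
  .build();

async function publish(payload) {
  // Pure data (indexation) message: no tokens or fees are involved.
  const message = await client
    .message()
    .index('PHONENDO.DEMO')
    .data(new TextEncoder().encode(JSON.stringify(payload)))
    .submit();
  return message.messageId;
}

publish({ hr: 72, ts: Date.now() }).then((id) => console.log('published as messageId', id));
```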

To sum up, IOTA is a highly suitable DLTI for our platform in terms of efficiency and performance. Nonetheless, there are two significant issues that need to be considered. Firstly, due to its feeless nature, the ledger quickly increases in size. To address this issue, a snapshot process, which instantiates a new ledger with the current balances, is periodically performed; moreover, a Permanode [28] is required to retrieve historical information. Secondly, IOTA requires a Proof of Work (PoW) per transaction as a spam-prevention mechanism. Thus, if the publishing device has limited computational resources, it may delegate the PoW to a third party.

It is important to note that both problems arise from the use of IOTA’s public network and not from the use of its technology. Therefore, it would be possible to deploy a hypothetical DLTI using this technology, limited to storing information of interest to the domain, while reducing or eliminating the PoW per transaction. However, it is easy to see that (i) if access is not limited, this hypothetical DLTI could be easily attacked, whereas (ii) if access is limited, we are essentially converting the DLTI into a permissioned one.

3.3 Potential applications and limitations of solutions

After analysing the impact of a DLTI on the data register and the considerations needed to adapt to a wearable scenario, it is necessary to point out both potential applications and limitations.

Regarding the former, our hypothesis is that having a platform with the characteristics mentioned in Sect. 3.1 allows for the conceptualisation of applications that can benefit from the value added by this platform, especially in healthcare:

  • Validation of medical studies It is possible to guarantee that the data have not been manipulated and reflect those captured in the given study.

  • Transparency and traceability The platform enables determining the time when a healthcare episode begins, avoiding its concealment. This information can be used to measure aspects such as the reaction time of health services or identify the source of the episode, thereby preventing negligence.

  • Health research With data stored in the platform from trusted sources, “high-quality” data can be provided for studies, where quality refers to the veracity of the available data.

  • Certification of medical conditions Controlled environments can be used to perform activities such as stress tests on elite athletes and certify the result to a third party.

  • Incentivising the performance of healthy activities Gamification systems can be established to incentivise healthy habits among citizens, with the involvement of governments and health practitioners.

  • Information sharing between different entities Users can share their health records with different specialists.

However, it is important to note that any of the features listed in Sect. 3.2 can be defeated by employing different attacks. Therefore, it is necessary to highlight that a data-driven solution should not be built if the cost of attacking it is lower than the potential benefit.

Furthermore, any solution focused on capturing data should be reconsidered if participants have to provide data that may benefit or harm them. This is because such scenarios can motivate participants to misbehave in order to obtain benefits. This is particularly important if the data can be manipulated prior to publication.

Finally, considering the sensitivity of the data captured by wearables, solutions should be avoided if users’ privacy is at risk, even if the information can be encrypted prior to storage.

Note that the mentioned limitations not only apply to the proposed platform, but also to any platform with similar characteristics. Therefore, solutions should be ideally implemented in those scenarios that meet one or more of the following characteristics:

  • Individuals have no direct control over the readings of the wearable devices they use, and the wearable devices are provided to them ready to use.

  • The devices are used under the monitoring of an impartial supervisor.

  • The data captured do not cause harm to the individual using the devices, and there are no desirable or undesirable values for them.

  • The correct use of the devices is beneficial to the wearer.

4 Phonendo platform

In previous sections we analysed the importance of DLTIs for storing data from IoT devices and identified IOTA as the most suitable DLTI. We also explored the different potential applications of this infrastructure in mHealth.

Therefore, in this section, we present Phonendo, a platform consisting of several software services that manage the entire data lifecycle, from wearable device data collection to publication on IOTA. This section describes Phonendo’s architecture, services, design considerations and data flow, which are further detailed in the following subsections. Phonendo’s source code is available on GitHub.

4.1 Phonendo’s architecture

Phonendo’s architecture follows an event-driven microservice approach and comprises five components: Reader, Manager, Storage, Verifier and Publisher. Figure 1 illustrates the interconnections between all the components (see Sect. 4.2) and their main functionalities. This architecture has been designed with flexibility, scalability and adaptability to different applications in mind.

Fig. 1 Phonendo architecture

Comparing Phonendo’s architecture with other proposed DLT-based architectures for healthcare, we observe that all proposals share modules for collecting and storing information. For instance, Casado-Vara and Corchado propose three layers dedicated to data collection, management, and storage [17]. Leeming et al. propose a blockchain layer and a storage layer with a blockchain-agnostic design [46]. Abdullah et al. propose an architecture focused on storage and information retrieval, distinguishing between publishers, who send information to IOTA, and fetchers, who read information from IOTA [3]. In our case, in addition to these common modules (“Reader”, “Manager”, “Storage”, and “Publisher”), we introduce the “Verifier”, which is responsible for ensuring the integrity and authenticity of the data.

The current implementation of Phonendo represents our initial efforts to validate the end-to-end functionality of the platform. Our primary objective was to demonstrate the operational feasibility of Phonendo by seamlessly integrating wearable device data.

For the development of Phonendo, we selected the Node.js framework and JavaScript programming language due to their suitability for rapid prototyping and their widespread adoption within the developer community. The choice of HTTP as the communication protocol was driven by the simplicity it offers in facilitating data transfer between services.

During the implementation phase, we utilised the PineTime smartwatch as a reference device to ensure compatibility and assess the integration of Phonendo with a real-world wearable device. This allowed us to validate the functionality of the platform and its ability to handle data from smartwatches, which are commonly used in healthcare and fitness applications.
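As a point of reference, a Reader along these lines could look like the sketch below: it subscribes to the standard BLE Heart Rate Measurement characteristic (0x2A37) and forwards each reading to the Manager over HTTP. The BLE library (@abandonware/noble), the Manager endpoint and the payload fields are assumptions made for illustration rather than Phonendo's actual implementation; Node.js 18 or later is assumed for the global fetch.

```javascript
// Illustrative Reader sketch: read BLE heart-rate notifications and forward
// them to the Manager via HTTP (library, endpoint and fields are assumptions).
import noble from '@abandonware/noble';

const MANAGER_URL = 'http://localhost:3001/events'; // hypothetical Manager endpoint

noble.on('stateChange', (state) => {
  // Scan only for the standard Heart Rate service (0x180D).
  if (state === 'poweredOn') noble.startScanning(['180d'], false);
});

noble.on('discover', (peripheral) => {
  noble.stopScanning();
  peripheral.connect(() => {
    // Look up the Heart Rate Measurement characteristic (0x2A37).
    peripheral.discoverSomeServicesAndCharacteristics(['180d'], ['2a37'], (err, services, chars) => {
      if (err || chars.length === 0) return;
      const hrm = chars[0];
      hrm.subscribe();
      hrm.on('data', async (buffer) => {
        // Simplification: assume the 8-bit heart-rate format (flags byte 0, value byte 1).
        const value = buffer.readUInt8(1);
        await fetch(MANAGER_URL, {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ device: peripheral.address, value, ts: Date.now() }),
        });
      });
    });
  });
});
```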

It is important to note that the current implementation of Phonendo serves as a starting point for further research and development. As we continue to refine the platform and explore additional use cases, we anticipate introducing enhancements and optimisations based on empirical experimentation and user feedback.

4.2 Phonendo’s services

In this section, the services that constitute Phonendo’s architecture are described. These services are designed to handle different aspects of the data lifecycle and enable the seamless flow of data within the platform. Below, we provide an overview of each service:

  • Reader: It manages the connection with the wearables, acting as the gateway of the system. When new data are received, a verification process is carried out to identify the sender. If the sender was previously registered in Phonendo, the event is sent to Manager to start the processing; otherwise, the new sender is registered in the platform.

  • Manager: It manages the life cycle of the data, interconnecting all of Phonendo’s services. Its main responsibilities are (i) encapsulating communication and orchestrating the rest of the components to perform the business logic and (ii) certifying data and data provenance. To do so, each message is signed with a wearable’s public/private key. This key is generated for each wearable device from its MAC address and a given password, applying the SHA256 algorithm [31]. This password is owned by Manager and is unique to each application (a minimal sketch of this derivation and signing is given after this list).

  • Storage: It enables both modelling information and controlling the state of data in the operational flow. It allows retrying any event that has not been sent due to an infrastructure or software failure. To improve the platform’s performance and scalability, a key-value database engine, LevelDB, has been used.

  • Verifier: It validates the integrity of the data using multiple heuristics and generates, as a result, a signed message that timestamps the captured data. These heuristics may differ depending on the scenario; in our case, Verifier checks that values are within allowed ranges and that data timestamps are recent enough to discard old transactions (see the sketch after this list).

  • Publisher: It is responsible for carrying out the publication on the IOTA network [28], where the messages signed by Verifier are published. To allow traceability, a common index is used and, in addition, each message is linked to the last messageId as its parent message.
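To make the signing chain described for Manager and Verifier concrete, the following sketch derives a per-device key pair from SHA256(MAC address + password), signs the captured record on behalf of the device, and then applies the Verifier's heuristics and countersignature. The choice of Ed25519 via the tweetnacl library, the allowed range and the field names are assumptions for illustration; the text above only prescribes SHA256-based key generation.

```javascript
// Hedged sketch of Manager/Verifier signing: deterministic per-device keys
// seeded from SHA256(MAC + password) and Ed25519 signatures via tweetnacl.
// Ranges and field names are illustrative assumptions.
import { createHash } from 'node:crypto';
import nacl from 'tweetnacl';

// Manager: derive a deterministic per-device key pair and sign the captured data.
function deviceKeyPair(mac, password) {
  const seed = createHash('sha256').update(mac + password).digest(); // 32-byte seed
  return nacl.sign.keyPair.fromSeed(seed);
}

function managerSign(record, mac, password) {
  const { publicKey, secretKey } = deviceKeyPair(mac, password);
  const bytes = Buffer.from(JSON.stringify(record));
  return {
    ...record,
    devicePublicKey: Buffer.from(publicKey).toString('hex'),
    deviceSignature: Buffer.from(nacl.sign.detached(bytes, secretKey)).toString('hex'),
  };
}

// Verifier: apply simple heuristics (allowed range, recent timestamp) and
// countersign the result, timestamping the captured data.
const verifierKeys = nacl.sign.keyPair();

function verify(signedRecord, maxAgeMs = 60000) {
  const inRange = signedRecord.value >= 30 && signedRecord.value <= 220; // e.g. plausible heart rate
  const fresh = Date.now() - signedRecord.ts <= maxAgeMs;               // discard old transactions
  if (!inRange || !fresh) return null;                                  // rejected by the heuristics
  const verified = { ...signedRecord, verifiedAt: Date.now() };
  const bytes = Buffer.from(JSON.stringify(verified));
  return {
    ...verified,
    verifierSignature: Buffer.from(nacl.sign.detached(bytes, verifierKeys.secretKey)).toString('hex'),
  };
}
```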

4.3 Architectural advantages of the Phonendo platform

The design of Phonendo’s architecture is the result of careful considerations to meet the requirements of a robust and flexible platform for managing wearable device data. To achieve this, Phonendo adopts a microservice event-driven architecture, which offers several advantages. Let’s delve into the reasons for each component and the benefits they bring to the platform:

  • Reader: The Reader is specifically designed to support various types of wearables and establish a secure and reliable communication channel. One of the key advantages of separating the Reader functionality is its low computational requirements, allowing it to run on low-spec devices. This opens up the possibility of deploying multiple Reader instances at a low cost, both in terms of energy consumption and acquisition. Additionally, the use of low-power devices enables battery-powered operation, further enhancing the scalability and flexibility of the Phonendo platform.

  • Manager: The Manager service serves as the orchestration layer in Phonendo’s architecture. It plays a crucial role in managing system changes and addressing various challenges that may arise. By separating the Manager component, Phonendo ensures the flexibility to adapt to evolving requirements and seamlessly handle issues.

    One important aspect of the Manager is its role as the representative of the Reader components. While Readers capture data from wearable devices, it is the Manager that generates and manages the private/public key pairs for each device, as well as handles the communication process. This design ensures the Manager’s authenticity and prevents unauthorised entities from impersonating it.

  • Storage: The Storage service in Phonendo is a vital component responsible for managing data and controlling its state throughout the operational flow. By separating the storage functionality, Phonendo offers several advantages. Firstly, it enables the Manager to operate with stateless logic, allowing for streamlined data orchestration and ensuring data integrity. In addition, the Storage service provides resilience in case of Manager failures or downtime by allowing the Manager to consult the stored data and reconstruct the system’s state.

    Moreover, the separation of the Storage service opens up possibilities for further enhancements. For instance, Phonendo can explore distributing information across multiple Storage services, enabling efficient data processing and resource utilisation. Additionally, the platform can leverage the flexibility of the Storage service to accommodate different database technologies, providing the freedom to switch to alternative solutions based on specific needs and scalability requirements.

    Furthermore, the Storage service can serve multiple Managers, enabling centralised data storage and retrieval while maintaining modularity and scalability. This capability empowers Phonendo to support diverse use cases and scenarios where multiple Managers can access and interact with the same storage infrastructure.

  • Verifier: The Verifier service in Phonendo plays a critical role in ensuring data integrity and authenticity. While its primary function is to validate the received data, it goes beyond that by issuing a verifiable signature on the verified information. This signature serves as proof of the Verifier’s endorsement, adding an additional layer of trust and establishing the Verifier as a trusted authority.

    The separation of the Verifier from the Manager in Phonendo’s architecture has several motivations and potential future implications. By isolating the Verifier, Phonendo enables it to operate independently, allowing for enhanced security and trust. One possible avenue for future exploration is the publication of the Verifier’s public key on a trusted platform, such as a blockchain, along with associated metadata. This would facilitate the establishment of hierarchical trust endorsements, where higher-level Verifiers endorse the identity and verification capabilities of lower-level Verifiers.

    Furthermore, the use of advanced governance frameworks could enable the endorsement of trust from trusted entities, such as notaries or regulatory bodies, to specific Verifiers within the Phonendo ecosystem. These endorsements could enhance the overall trustworthiness and reliability of the verified data. Additionally, the exploration of revocation mechanisms could allow for the timely and secure revocation of certain data, ensuring data accuracy and accountability.

    While these ideas are still in the realm of future possibilities, they serve as motivations for the architectural decision to separate the Verifier from the Manager in Phonendo.

  • Publisher: The Publisher service in Phonendo is responsible for data publication on the IOTA network. By separating this functionality, Phonendo achieves flexibility in resource allocation and scalability. Lightweight devices like Arduino or Raspberry Pi can be utilised for the Reader components, while more powerful devices can handle the Publisher service. This design choice enables cost-effective deployment and horizontal scaling.

    Furthermore, the separation of the Publisher service opens up possibilities for future enhancements. Alternative DLTs can be seamlessly integrated into Phonendo, providing adaptability to evolving healthcare data management requirements. Additionally, exploring collaborative PoW schemes and hybrid data publication approaches can optimise efficiency and security.

The architecture of Phonendo is designed with the principles of simplicity, performance, flexibility, and scalability in mind. By separating the functionality into individual services, Phonendo provides a modular and adaptable platform that can be easily tailored to different application scenarios. This design approach empowers developers and researchers to integrate their IoT solutions with a DLTI effectively.

4.4 Phonendo’s data flow

Fig. 2 Phonendo’s sequence diagram

This section describes the data flow carried out in Phonendo and the interaction of all its services detailed above (see Fig. 2).

  1. Matching. This is the first step, enabling the connection between a wearable device and Phonendo. The connection process is managed by the Reader component, which performs some basic verifications associated with a minimum contract, indicating the device’s serial number, the type of wearable, the data types and other basic information. Finally, the device is registered in the database and an API token is provided to perform the rest of the operations.

  2. Data reception. Once a device is successfully registered, it can start sending events to the system. This step involves the Reader and Manager components: Reader notifies Manager of newly received data through an HTTP request. Once Manager receives the data, it drives their processing, verification and publication.

  3. Data processing. Manager requests Storage to model the data to allow abstraction, obtaining a JSON document as a result. Internally, Storage stores these data and sets their status to “Captured”.

  4. Verification. Manager signs the data to ensure data provenance. In addition, Manager requests Verifier to verify and sign this document, resulting in a signed JSON document. Data are signed using the Verifier’s public/private key and the SHA256 algorithm. This key is shared between all the instances in the deployed environment.

  5. Status update: verified. Manager notifies Storage of the data verification, and Storage updates the status to “Verified”.

  6. Data publication. Manager requests the publication by Publisher. This process involves preparing the data, creating the IOTA message structure and sending it to the Tangle, obtaining as a result the confirmation of the publication in IOTA with its “messageId”.

  7. Status update: published. Manager notifies Storage of the publication, and Storage removes the entry from the database to end the data lifecycle (a minimal sketch of this status lifecycle is given after this list).
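A minimal sketch of the Storage status lifecycle implied by this flow is shown below, assuming the level (LevelDB) package with JSON value encoding; the keys, field names and the parent-message bookkeeping are illustrative assumptions rather than Phonendo's exact schema.

```javascript
// Hedged sketch of the Storage status lifecycle: Captured -> Verified ->
// removed after publication, using LevelDB through the level package.
import { Level } from 'level';

const db = new Level('./phonendo-storage', { valueEncoding: 'json' });

async function capture(id, record) {
  await db.put(id, { ...record, status: 'Captured' });
}

async function markVerified(id, verifiedRecord) {
  await db.put(id, { ...verifiedRecord, status: 'Verified' });
}

async function markPublished(id, messageId) {
  // Remember the last messageId so the next publication can reference it as
  // its parent, then remove the entry to end the data lifecycle.
  await db.put('last-message-id', { messageId });
  await db.del(id);
}

// Recovery: list any entries that never reached publication, so that the
// Manager can retry them after an infrastructure or software failure.
async function pendingEntries() {
  const pending = [];
  for await (const [key, value] of db.iterator()) {
    if (value && value.status) pending.push({ key, ...value });
  }
  return pending;
}
```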

This data flow ensures the seamless processing and publication of wearable device data in Phonendo, providing a reliable and secure platform for managing and utilising such data.

5 Conclusions and future work

In this paper, we have introduced a novel proposal for securing IoMT based on DLT and developed a platform called Phonendo that allows pairing wearable devices, capturing, verifying and storing their data streams, and publishing them on a dedicated DLTI, thus avoiding previous flaws. Specifically, Phonendo (version 1.0) has been presented, describing its architecture, services and data flow. Its code and end-to-end demos are publicly available on GitHub; therefore, developers and researchers can take advantage of Phonendo to integrate their IoT solutions with a DLTI.

Currently, Phonendo is limited to Bluetooth Low Energy (BLE), so devices using other communication protocols are not compatible. However, since Phonendo is open source and its code is available, any developer can extend the system to support new protocols and devices. It is important to note that Phonendo is a tool that researchers and practitioners can use as a starting point to develop trusted IoT systems adapted to specific scenarios.

Regarding future work, there are several directions that might be pursued. Phonendo’s services can be extended as detailed in Sect. 4.3, by exploring different storage solutions, using advanced governance frameworks to provide trust, or optimising efficiency and security through collaborative PoW schemes and hybrid data publication. Additionally, the platform can be enhanced with more functionality, focusing on aspects such as encryption, connection protocols, access regulation, query resolution or the generation of analytical data. An example of such new features could be the integration of Self-Sovereign Identity (SSI), leveraging the IOTA Identity framework [28].