
Digital Twin Technology

A Creator's Tool


What is DT & Why?

Digital twin technology promises to transform the design process and give engineers, manufacturers, and businesses a 360-degree view of products and systems throughout the entire lifecycle.

 

Armed with an enriched pool of data provided by the Internet of Things (IoT), the design technology stands poised to deliver previously impossible opportunities. But there's a catch: a key capability is missing from conventional modelling and simulation tools.

​

For the technology to deliver on its promise, it must be able to run analytics in real time or faster, provide a high degree of prediction accuracy, and integrate data from a collection of disparate and often incompatible sources.

Digital Twin Technology - Applications 


Roles & Responsibilities

As the Technical Lead, I undertake activities such as gathering requirements from varied Digital Twin creators, researching the information architecture, conducting user research, producing low-fi and hi-fi wireframes, prototyping, testing, developing the front end in Unity, and providing the backend team with technical details.


Product's Goal

The goal is to develop a fully extensible, cross-functional, collaborative creator tool for forging Digital Twins. It covers aspects such as version control, importing ontologies in varied formats from different sources, CRUD operations on DTDL (Digital Twins Definition Language) models from the UI, and publishing the finished content to the Azure backend with ease. It is sophisticated software accompanied by a straightforward user experience that promotes the creation of digital twins through 3D representation.
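
For a sense of what the tool operates on, here is a minimal DTDL v2 interface, written as a Python dict rather than raw JSON-LD. The thermostat model, its DTMI, and its fields are purely illustrative examples, not part of the product.

# A minimal, hypothetical DTDL v2 interface: one Property and one Telemetry field.
# The DTMI and field names are illustrative only.
thermostat_model = {
    "@context": "dtmi:dtdl:context;2",
    "@id": "dtmi:example:Thermostat;1",   # model identifier; the trailing ";1" is the version
    "@type": "Interface",
    "displayName": "Thermostat",
    "contents": [
        {"@type": "Property", "name": "targetTemperature", "schema": "double", "writable": True},
        {"@type": "Telemetry", "name": "temperature", "schema": "double"},
    ],
}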

Research & Methodology

Understanding User Needs
Personas
User Pain points

The tool ingests source files in Microsoft's standard DTDL (Digital Twins Definition Language) received from different sources in varied formats. The process starts with automatic validation of the imported data before conversion. A simplified UI helps refine the JSON files to guarantee data quality, supports focused analysis of individual elements within an asset to study the properties of the data syntax, and uses the Azure backend to publish the Digital Twin to the cloud. The tool provides authentication and approval capabilities before the digital twin is released.
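
As a rough illustration of the validation step, the sketch below checks the handful of fields every DTDL interface must carry before conversion. The helper name, the one-interface-per-file assumption, and the exact rule set are assumptions for illustration, not the product's actual validator.

import json

REQUIRED_FIELDS = {"@id", "@type", "@context"}  # minimum a DTDL interface must declare

def validate_dtdl_file(path: str) -> list[str]:
    """Return a list of human-readable problems found in one imported DTDL file."""
    with open(path, encoding="utf-8") as f:
        try:
            model = json.load(f)
        except json.JSONDecodeError as exc:
            return [f"{path}: not valid JSON ({exc})"]

    # This sketch assumes one interface per file; arrays of models would need a loop.
    if not isinstance(model, dict):
        return [f"{path}: expected a single JSON object (one interface per file)"]

    problems = []
    missing = REQUIRED_FIELDS - model.keys()
    if missing:
        problems.append(f"{path}: missing required fields {sorted(missing)}")
    if model.get("@type") != "Interface":
        problems.append(f"{path}: top-level @type should be 'Interface'")
    if not str(model.get("@id", "")).startswith("dtmi:"):
        problems.append(f"{path}: @id should be a DTMI such as 'dtmi:example:Thermostat;1'")
    return problems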

​

A parallel flow covers ingestion of DTDL source files received from IoT sensors to control data configuration. The same steps apply: automatic validation of the imported data before conversion, UI-assisted refinement of the JSON files to guarantee data quality, and publishing of the Digital Twin to the cloud through the Azure backend, again gated by authentication and approval.
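
Publishing relies on the Azure Digital Twins service. Below is a minimal sketch, assuming the azure-digitaltwins-core and azure-identity Python packages and a hypothetical instance URL, of uploading a batch of validated DTDL models; it is not the product's actual publishing code.

from azure.identity import DefaultAzureCredential
from azure.digitaltwins.core import DigitalTwinsClient

# Hypothetical Azure Digital Twins instance; replace with a real host name.
ADT_ENDPOINT = "https://my-adt-instance.api.weu.digitaltwins.azure.net"

def publish_models(dtdl_models: list[dict]) -> None:
    """Upload a batch of validated DTDL models to an Azure Digital Twins instance."""
    client = DigitalTwinsClient(ADT_ENDPOINT, DefaultAzureCredential())
    # create_models accepts the DTDL documents as a list of dicts (JSON objects).
    created = client.create_models(dtdl_models)
    for model in created:
        print("published", model.id)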

Based on the discussions and interviews conducted with varied Digital Twin creators, the ability to import and stitch together multiple ontologies from multiple sources is a challenging but essential capability. It is also crucial to have a structured approval process and diversified personas based on the organizational hierarchy.

Persona 4 - Reader

This persona has read-only access, is restrained from CRUD operations, and is assisted with notifications of the incorporated changes.

Persona 3 - Contributor

The Contributor is confined to read/write/modify/update (basic CRUD) operations and dispatches these updates for approval. The dashboard displays the approval notifications.

Persona 1 - Owner

The Owner persona has complete access to the resources. The role includes the right to delegate access, restrict other users, and approve and publish the final DTDL files to the cloud. The owner dashboard displays updates and navigation to the relevant projects.

Persona 2 - Admin

Admin functionalities include adding, removing, approving, and publishing content in the designated project. The admin dashboard displays a summary of the assigned project.
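
To make the split between the four personas concrete, here is a minimal sketch of a role-to-permission mapping. The enum names and permission strings are assumptions about how such access control could be expressed, not the product's actual implementation.

from enum import Enum

class Persona(Enum):
    OWNER = "owner"
    ADMIN = "admin"
    CONTRIBUTOR = "contributor"
    READER = "reader"

# Hypothetical permission sets mirroring the personas described above.
PERMISSIONS = {
    Persona.OWNER:       {"read", "write", "delete", "approve", "publish", "delegate_access"},
    Persona.ADMIN:       {"read", "write", "delete", "approve", "publish"},
    Persona.CONTRIBUTOR: {"read", "write", "update", "submit_for_approval"},
    Persona.READER:      {"read"},
}

def can(persona: Persona, action: str) -> bool:
    """Check whether a persona is allowed to perform an action."""
    return action in PERMISSIONS.get(persona, set())

assert can(Persona.CONTRIBUTOR, "submit_for_approval")
assert not can(Persona.READER, "publish")

Grouping the checks behind a single helper keeps the UI code persona-agnostic: each dashboard simply asks whether an action is permitted before showing the corresponding control.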

01. Converting all ontologies to DTDL

02. Cloud save of all projects

03. Hard to work with typical DTDL JSON

04. Very complex to keep multiple versions

Inference and Implementation

The most important features DT creators needed were: importing ontologies from different sources in varied formats, publishing the created content (DTDL models) to an Azure Digital Twins instance with ease, keeping different DTDL versions, comparing local files against the cloud files, and simplifying the reading of DTDL files through the UI.

​

The items implemented in the product design are: create/edit DTDL files through the UI; publish the DTDL contents to Azure with a single button press (Owner and Admin personas); let the Contributor persona send updated/modified files for approval; a comparison view to compare files; a cloud import button with a list view showing the different ontologies from different sources; conversion of all standard ontology formats to DTDL; and retention of all versions of the DTDL models.
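
The comparison view matches local DTDL files against what already exists in the cloud. Below is a minimal sketch, assuming the azure-digitaltwins-core list_models call, of how such a diff could be computed by DTMI; the helper name and the classification buckets are illustrative assumptions.

from azure.digitaltwins.core import DigitalTwinsClient

def diff_local_vs_cloud(client: DigitalTwinsClient, local_models: list[dict]) -> dict[str, list[str]]:
    """Classify local DTDL models as new, already published, or cloud-only by DTMI."""
    cloud_ids = {m.id for m in client.list_models()}   # DTMIs already in the instance
    local_ids = {m["@id"] for m in local_models}
    return {
        "only_local": sorted(local_ids - cloud_ids),    # candidates to publish
        "in_both": sorted(local_ids & cloud_ids),       # compare contents in the UI
        "only_cloud": sorted(cloud_ids - local_ids),    # present in Azure but not on disk
    }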

Affinity Map

Findings

Node-based DTDL editing is good.

Customizable UI

Easier navigation & Intuitive tools

Appealing visual design

Observation

Good Visualization of DTDL

Need based tool options

Create Azure resources in editor

UI based approach to Edit DTDL

Confusion

Confusion - model graphs & nodes

More info on query language

More info on Azure resources

Can try different theme options

Information Architecture
