Introduction – AI for Digital Threads and Integrated Digital Engineering
Welcome to our new blog series – Artificial Intelligence (AI) for Digital Threads and Integrated Digital Engineering. Our goal in this series is to share real, practical examples of leveraging AI/ML for digital threads and for accelerating integrated digital engineering of complex cyber-physical systems.
Generative AI and the availability of Large Language Models (LLMs) have made natural language processing ubiquitous. We converse with ChatGPT (OpenAI) for help with daily tasks, from generating code snippets and business processes to podcasts and even math problems for our kids. Gen AI and LLMs have made it easy to extract content from large collections of semantically unstructured documents into structured data sets that are immediately useful, and to generate new documents with insights gathered from existing datasets.
Digital threads are foundational for integrated digital engineering and digital twins, connecting models/data across disciplines and data silos into a large, curated knowledge graph. Syndeia, our digital thread platform, provides a comprehensive set of capabilities to build, navigate, query, visualize, and configuration-control live, enterprise-scale, secure digital threads. Syndeia makes it possible to rapidly build digital thread knowledge graphs from enterprise repositories, such as requirements, system architecture, hardware (PLM, MCAD/ECAD), software (ALM, SCM), simulations, verification, manufacturing, and sustainment. Figure 1 below illustrates an example digital thread connecting digital model/data repositories across disciplines in an enterprise.
DoD Instruction 5000.97 has put a renewed focus on digital threads as a foundation for integrated digital engineering. It states – “The digital thread should seamlessly advance the controlled interplay of technical data, software, information, and knowledge in the digital engineering ecosystem. Digital threads are used to connect authoritative data and orchestrate digital models and information across a system’s life cycle. The digital thread informs decision makers throughout a system’s life cycle by providing the capability to access, integrate, and transform data into actionable information.”
Intercax is a pioneer in vendor-neutral and enterprise-scale digital thread technology. The Syndeia platform turned 10 this year! Figure 2 below illustrates the Digital Thread Explorer capability on the Syndeia Web Dashboard that makes it possible to seamlessly navigate digital thread knowledge graphs spanning enterprise data/model repositories.
The combination of AI and Digital Threads opens up exciting new technical fronts for integrated digital engineering. It is our goal to share new use cases and capabilities made possible by combining AI services and LLMs with the Syndeia digital thread platform.
Part 1 – Connecting with OpenAI as a service in Syndeia
Organizations are deploying AI services for extracting insights from large sets of unstructured data and disconnected legacy documents. Responses from an AI service can be connected with models/data from other repositories as part of a digital thread. Beyond traceability, this will ensure that results of a computation can be used for downstream activities. In this blog post, we will use OpenAI as an example AI service to demonstrate this use case. Though the steps are shown for OpenAI, this general approach will work for any AI service that provides a REST API that can respond to user prompts.
Syndeia provides a Generic RESTful Integration capability that can be used to integrate with API endpoints for any repository or service that provides a REST API, directly from the Syndeia Dashboard. OpenAI provides a REST API for its capabilities, specifically the Chat Completions capability that will be demonstrated here.
The figure below shows a simple workflow for integrating OpenAI as a service with Syndeia using the Generic RESTful Integration capability. The general steps are as follows.
- User adds the OpenAI service as a repository in the Syndeia Web Dashboard, and creates requests for prompts.
- Requests are triggered to OpenAI from Syndeia, either via the Web Dashboard or API automation.
- OpenAI sends a response back to Syndeia that can be viewed in the Syndeia Web Dashboard or in automated reports.
- Syndeia wraps the API response as an artifact that can be connected to artifacts in other repositories, e.g. requirements management, PLM/ALM, verification, as part of the digital thread for a system.
Step 1 is done once, while steps 2-4 can be executed at any frequency decided by the user. Users can create collections of requests with prompts. This entire workflow can be executed from the Syndeia Dashboards, or using the Syndeia API as a part of an automated and scheduled process.
The overall steps in the workflow are presented below. Step-by-step tutorials and demos for Syndeia’s RESTful integration are available on our public documentation site.
- Tutorials: Syndeia RESTful Integration Exercises
- Demonstrations: Syndeia RESTful Integration Demos
Step 1 – Adding OpenAI as a RESTful Repository.
Add OpenAI as a new RESTful repository from the Syndeia Web Dashboard. You will need the following for OpenAI:
- API URL for OpenAI, as shown in the figure below.
- API key for your OpenAI account. Refer to the following site for creating your OpenAI API key: OpenAI Platform. Set the API key as the token with Bearer authentication.
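As a quick check that the API URL and key are set up correctly, the same two values can be exercised outside Syndeia. Below is a minimal sketch in Python using the requests library, assuming the key is stored in an OPENAI_API_KEY environment variable:

```python
import os

import requests

# The two items above map directly to this call: the base API URL for OpenAI
# and the API key sent as a Bearer token.
API_URL = "https://api.openai.com/v1"
API_KEY = os.environ["OPENAI_API_KEY"]  # key created on the OpenAI Platform

# List available models as a quick check that the URL and key are valid.
response = requests.get(
    f"{API_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
response.raise_for_status()
print([model["id"] for model in response.json()["data"][:5]])  # first few model ids
```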
Step 2 – Add requests to OpenAI repository.
Once OpenAI is added as a repository, users can add requests using the available API endpoints that OpenAI provides. Refer to the OpenAI API reference docs here: OpenAI Platform.
In this case, a POST request is added to the chat completions endpoint (/chat/completions) with a body that contains the prompt, e.g. What is the mass of the James Webb Space Telescope? Users can test the request and view the response by clicking on the Test button. The response (in JSON) shows the answer returned by the OpenAI repository. Users can provide a name for the response, e.g. Mass of JWT, as shown in the figure below. In this way, users can create collections of requests with different prompts. Specific GPT models (e.g. gpt-3.5-turbo or gpt-4o-mini) can be specified in the request body with the prompt.
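For reference, the body of such a POST to /chat/completions follows OpenAI's standard chat format: a model identifier plus a list of messages carrying the prompt. A minimal sketch of the same request sent directly with Python (the model name here is illustrative):

```python
import os

import requests

API_KEY = os.environ["OPENAI_API_KEY"]

# Request body as entered in the Syndeia request editor: the model and the prompt.
body = {
    "model": "gpt-4o-mini",  # any available chat model can be named here
    "messages": [
        {
            "role": "user",
            "content": "What is the mass of the James Webb Space Telescope?",
        }
    ],
}

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=body,
    timeout=60,
)
response.raise_for_status()
print(response.json())  # raw JSON response, similar to what the Test button shows
```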
Step 3 – View requests and responses in the Syndeia Web Dashboard.
Users can view all the requests and their raw responses in the Syndeia Web Dashboard. The figure below shows a collection with 2 requests and their responses. The requests with prompts sent to OpenAI repository are highlighted in orange, while the responses are highlighted in green.
The specific request prompts and their responses received from the OpenAI repository are as follows.
- Request: What is the mass of the James Webb Telescope?
- Response: The mass of the James Webb Space Telescope (JWST) is approximately 6,200 kilograms (13,600 pounds).
- Request: What is the size of the Mars 2020 Rover?
- Response: The Mars 2020 rover, named Perseverance, has a size of about 3 meters (10 feet) in length, 2.7 meters (9 feet) in width, and 2.2 meters (7 feet) in height. It weighs about 1,025 kilograms (2,260 pounds).
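The raw responses shown in the dashboard follow OpenAI's chat completion JSON format, where the answer text is nested under choices[0].message.content. A small sketch of extracting that field from a saved response (the id and trimmed fields below are illustrative):

```python
import json

# A trimmed chat completion response in OpenAI's format; the id is illustrative.
raw_response = """
{
  "id": "chatcmpl-example",
  "object": "chat.completion",
  "model": "gpt-4o-mini",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "The mass of the James Webb Space Telescope (JWST) is approximately 6,200 kilograms (13,600 pounds)."
      },
      "finish_reason": "stop"
    }
  ]
}
"""

payload = json.loads(raw_response)
# The answer text lives under choices[0].message.content.
print(payload["choices"][0]["message"]["content"])
```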
Step 4 – Connect OpenAI response artifacts to digital threads.
The responses received from the OpenAI repository are available in Syndeia as artifacts that can participate in digital thread relations. Users can create digital thread relations from/to any response artifact to/from artifacts in other repositories, as part of a digital thread project. The figure below shows the Digital Thread Explorer view for the Space Telescope Digital Testbed project in the Syndeia Web Dashboard. The digital thread subset in the figure shows that the Mass of JWT response artifact (OpenAI), also shown in Figures 5 and 6 above, is connected to the Space Telescope hardware assembly (Teamcenter PLM) from which the mass was derived and to the Mass requirement (Jama Connect) that it satisfies. The color coding in the Digital Thread Explorer shows the repositories owning the artifacts; OpenAI is shown as a RESTful repository in violet.
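Conceptually, each relation links the response artifact to an artifact owned by another repository. The sketch below is purely illustrative and does not reflect Syndeia's actual data schema; it only conveys the shape of the two relations described above:

```python
# Illustrative only: NOT the Syndeia schema, just the idea that the OpenAI
# response artifact is related to artifacts owned by other repositories.
relations = [
    {
        "source": {"repository": "OpenAI (RESTful)", "artifact": "Mass of JWT"},
        "relation": "derived from",
        "target": {"repository": "Teamcenter PLM", "artifact": "Space Telescope assembly"},
    },
    {
        "source": {"repository": "OpenAI (RESTful)", "artifact": "Mass of JWT"},
        "relation": "satisfies",
        "target": {"repository": "Jama Connect", "artifact": "Mass requirement"},
    },
]

for rel in relations:
    print(f"{rel['source']['artifact']} --[{rel['relation']}]--> {rel['target']['artifact']}")
```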
Step 5 – Using Syndeia Cloud API to interact with OpenAI.
Syndeia is an API-first technology. Anything you can do with the Syndeia Web Dashboard can be done with the Syndeia Cloud REST API, thereby making it possible to use Syndeia’s capabilities with automation and scheduling. Users can create and send requests (prompts) to OpenAI, view results, and use the resulting artifacts for downstream digital thread operations. The figure below shows the use of the Syndeia Cloud REST API, specifically its Python SDK, in a Jupyter notebook. The two API calls shown here are:
- Fetching the list of OpenAI repository requests managed in Syndeia (shown as Part 3). This is the same as shown in Step 3 (Figure 6) in the Syndeia Web Dashboard.
- Sending each request to OpenAI and viewing the response artifacts in a table (shown as Part 4). The Content column shows the responses from OpenAI for the Mass of JWT and Size of the Mars Rover prompts.
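As a rough sketch of this automation pattern only: the endpoint paths, field names, and environment variables below are hypothetical placeholders and are not the actual Syndeia Cloud REST API or Python SDK; refer to the Syndeia Cloud API documentation for the real calls.

```python
import os

import pandas as pd
import requests

# Hypothetical placeholders: base URL, token, endpoint paths, and field names
# are for illustration only and are not the actual Syndeia Cloud API.
BASE_URL = os.environ["SYNDEIA_CLOUD_URL"]
HEADERS = {"Authorization": f"Bearer {os.environ['SYNDEIA_CLOUD_TOKEN']}"}

# Part 3 (illustrative): fetch the OpenAI repository requests managed in Syndeia.
stored_requests = requests.get(f"{BASE_URL}/requests", headers=HEADERS, timeout=30).json()

# Part 4 (illustrative): execute each request and tabulate the response content.
rows = []
for req in stored_requests:
    result = requests.post(
        f"{BASE_URL}/requests/{req['id']}/execute", headers=HEADERS, timeout=60
    ).json()
    rows.append({"Request": req["name"], "Content": result["content"]})

print(pd.DataFrame(rows))
```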
Summary
This blog is the first in our new blog series on AI for Digital Threads and Integrated Digital Engineering. The blog specifically demonstrates a simple no-code approach for connecting an AI service with Syndeia, using OpenAI (ChatGPT) as an example. Users can add OpenAI as a RESTful repository, create collections of requests (prompts), execute requests and use responses from OpenAI as artifacts in a digital thread. The process can be automated using the Syndeia Cloud REST API.
Stay tuned for the next blog posts in this series where we will dive into more exciting use cases around AI, digital threads, and Syndeia.
Other Parts in this Series
- Part 1 – Connecting with OpenAI as a service in Syndeia (this post)
- Stay tuned for other parts.