K.Explorer installation requirements

K.Explorer requires the Domain Analysis Inspector component to be run on the Enterprise source code. This process is essential for K.Explorer to work effectively and to provide accurate analysis of the Enterprise code.

The setup process involves the following steps:

· Installation – The Domain Analysis Inspector component is installed on the client’s system. This can be done either manually or through an automated installer.

· Configuration – The client must provide the necessary configuration settings: the source code files to analyse and any additional settings required for the analysis.

· Analysis and Knowledge Base Creation – The Domain Analysis Inspector component scans the source code to create a corresponding knowledge base for each domain. The analysis uses a variety of algorithms and techniques, such as code parsing, pattern recognition, slicing, and machine learning, as stated above. This knowledge base is used by the K.Explorer tool to provide code snippets based on natural language queries, as well as descriptive ontology diagrams. All of this processing occurs on the customer’s servers, and only metadata, encrypted models, and encrypted snippets are uploaded to the tool’s internal repository (see the next section).

· Node Definition – Finally, the output of the analysis phase is used to define a corresponding node in the cloud environment structure for each domain analysed.
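As an illustration, a configuration file for the Configuration step above might look like the following. This is a hedged sketch only: the file format, keys, and paths are hypothetical, not K.Explorer's actual configuration schema.

```yaml
# Hypothetical Domain Analysis Inspector configuration (illustrative only;
# the real keys and file layout are defined by the product documentation).
domain: billing
source:
  root: /srv/enterprise/src        # location of the source code to analyse
  include:
    - "**/*.java"
  exclude:
    - "**/generated/**"            # skip generated artifacts
analysis:
  techniques: [parsing, pattern-recognition, slicing]
output:
  knowledge_base: /srv/kexplorer/kb/billing
  upload:                          # only metadata and encrypted artifacts leave the server
    metadata: true
    encrypted_models: true
    encrypted_snippets: true
```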

Once setup is complete, K.Explorer can be effectively deployed and take advantage of the client’s source code and context.

K.Explorer ensures that the privacy of the source code is maintained, by one of the following alternatives:

    a) the analysis is performed on the client’s facilities and not on external servers (preferred, since it fully guarantees the privacy of the source code);

    b) the analysis is performed on Morphis-Tech servers, following the practices described in the Security section.


In a cloud environment, all K.Explorer modules and nodes are created using Kubernetes.

Kubernetes is an open-source container orchestration platform for managing containerized applications and services. In the context of the architecture elements described earlier, Kubernetes is used to manage the various components of the K.Explorer tool that run on cloud servers.

Kubernetes provides several benefits for the cloud architecture of the K.Explorer tool, including:

· Scalability – Kubernetes allows the K.Explorer tool to scale up or down based on demand. This means that the tool can handle a large volume of source code analysis requests without sacrificing performance.

· Resilience – Kubernetes provides features such as automatic failover and self-healing, which ensure that the K.Explorer tool remains available and functional even in the event of component failures or server outages.

· Resource Optimization – Kubernetes can optimize resource usage by automatically scaling containers based on resource utilization and demand, which helps to reduce costs and improve efficiency.

By leveraging the benefits of Kubernetes, K.Explorer can provide reliable, scalable, and efficient source code analysis services to its users.
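The Scalability benefit above can be sketched with a standard Kubernetes HorizontalPodAutoscaler manifest. The deployment name, namespace, and thresholds below are hypothetical, not K.Explorer's actual configuration:

```yaml
# Hypothetical HorizontalPodAutoscaler for a K.Explorer analysis service;
# names, namespace, and thresholds are illustrative.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: analysis-service-hpa
  namespace: kexplorer
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: analysis-service
  minReplicas: 2          # keep redundancy for availability
  maxReplicas: 10         # scale out under heavy analysis load
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```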

The physical architecture is composed of a cluster with a hierarchical network of namespaces, providing the services that correspond to the logical components described in the Architecture section. It allows for the isolation and management of resources within namespaces, while providing a secure and scalable way to manage external traffic into the cluster through gateway nodes and ingress controllers.

The cloud environment is designed to be scalable, able to handle a large number of requests, and highly available, with redundancy built into the system to ensure that the tool remains available even in the event of a node failure.
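The per-namespace isolation and ingress routing described above might look like the following manifests. This is a hedged sketch: all names, hosts, and ports are hypothetical.

```yaml
# Illustrative namespace isolation and ingress routing;
# namespace, host, service names, and ports are hypothetical.
apiVersion: v1
kind: Namespace
metadata:
  name: kexplorer-domain-billing   # one namespace per analysed domain
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: kexplorer-ingress
  namespace: kexplorer-domain-billing
spec:
  rules:
    - host: kexplorer.example.com
      http:
        paths:
          - path: /billing
            pathType: Prefix
            backend:
              service:
                name: webapp       # routes external traffic to the web app service
                port:
                  number: 8080
```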


In an on-premises deployment option, the client may use Docker to replicate the cloud structure on their own machines. This deployment option provides the client with the ability to run the K.Explorer tool locally within their own environment, rather than relying on external cloud servers.

The deployment process for the on-premises option involves the following steps:

· Docker Setup

The client sets up Docker on their own servers to ensure compatibility with K.Explorer and its Kubernetes configuration.

· Cloud Structure Replication

The client then uses Docker to replicate K.Explorer’s cloud structure on their own machines. This involves creating local Docker containers corresponding to the nodes of the cloud structure.

· Source Code Analysis

Once the cloud structure has been replicated, the Domain Analysis Inspector component is run locally on the client’s infrastructure to perform source code analysis and create corresponding knowledge bases, as described in the Setup section. This step is the same for any deployment option.
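The replication step above could be expressed as a Compose file along the following lines. This is a minimal sketch: the image names, service names, ports, and volume layout are hypothetical, not K.Explorer's actual distribution.

```yaml
# Hypothetical docker-compose.yml replicating the cloud nodes on-premises;
# image names, ports, and volumes are illustrative.
services:
  webapp:
    image: kexplorer/webapp:latest
    ports:
      - "8443:8443"              # TLS endpoint for the browser client
    depends_on:
      - kb-storage
  inspector:
    image: kexplorer/domain-analysis-inspector:latest
    volumes:
      - ./src:/analysis/src:ro   # source code stays on the client's machine
      - kb-data:/analysis/kb
  kb-storage:
    image: kexplorer/kb-storage:latest
    volumes:
      - kb-data:/data            # encrypted snippets and metadata only
volumes:
  kb-data:
```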

This on-premises deployment option is similar to the cloud SaaS version in that it involves the same components and processes, but it is implemented on the client’s server infrastructure using Docker. This provides a secure and flexible deployment option that gives clients the benefits of K.Explorer while maintaining control over their own environment, and ensures that sensitive source code remains on-premises rather than being stored on external cloud servers.


The Code Generation Engine (CodeGen) component plays a special role because it uses generative models to produce its output. There are two options for accessing the Code Generation Engine component:

· It can be implemented by using external resources and models, such as the OpenAI GPT API or other existing public generative model APIs connected with K.Explorer.

· It can be implemented by using the internal K.Explorer model, which requires extra computational power in the form of GPUs with large memory sizes.

In the on-premises option, when using the internal K.Explorer model, the server hosting that service is located inside the local installation. The needed computational power is outlined in the Hardware Requirements section.
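The two access options can be sketched as a simple backend abstraction selected at deployment time. This is a minimal illustration: the class names, configuration keys, and placeholder return values are hypothetical, not K.Explorer's actual API.

```python
from abc import ABC, abstractmethod


class CodeGenBackend(ABC):
    """Common interface over the two Code Generation Engine options."""

    @abstractmethod
    def generate(self, prompt: str) -> str: ...


class ExternalApiBackend(CodeGenBackend):
    """Option 1: delegate to an external generative-model API (e.g. OpenAI GPT)."""

    def __init__(self, api_url: str):
        self.api_url = api_url

    def generate(self, prompt: str) -> str:
        # A real deployment would issue an authenticated HTTPS request here.
        return f"[external:{self.api_url}] completion for {prompt!r}"


class InternalModelBackend(CodeGenBackend):
    """Option 2: run the internal model locally (requires large-memory GPUs)."""

    def __init__(self, model_path: str):
        self.model_path = model_path

    def generate(self, prompt: str) -> str:
        # A real deployment would run inference on local GPU hardware here.
        return f"[internal:{self.model_path}] completion for {prompt!r}"


def make_backend(config: dict) -> CodeGenBackend:
    """Select the CodeGen backend from hypothetical deployment configuration."""
    if config.get("mode") == "external":
        return ExternalApiBackend(config["api_url"])
    return InternalModelBackend(config["model_path"])


backend = make_backend({"mode": "internal", "model_path": "/models/kexplorer"})
print(backend.generate("list overdue invoices"))
```

In a full on-premises installation only the internal backend would be configured, keeping all prompts and completions inside the client's network.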


To ensure that the source code is never visible or exposed, K.Explorer implements the following measures:

1. Encryption – All source code snippets that are referenced in the Knowledge Base are encrypted before they are stored.

2. Access Controls & Data Isolation – Access controls can be implemented to ensure that only authorized users have access to the source code. The tool is designed to isolate the analysed code and the knowledge base associated with each domain to ensure that they are only accessible to authorized users.

3. Data Deletion – Once the analysis is complete, the source code files are deleted from the tool’s system to ensure that they are not retained beyond their intended use. Only the encrypted versions will remain in the KB storage.

4. Secure Network – The K.Explorer tool is hosted on a secure network that is protected by firewalls and intrusion detection systems.

By implementing these measures, K.Explorer ensures that the source code is never visible or exposed to unauthorized users. K.Explorer complies with data privacy and security regulations, which are essential when handling sensitive information such as Enterprise source code.

The encryption approach for the source code is as follows:

The client’s source code is stored in an encrypted format in the database. This is achieved by applying one of the commonly used data encryption algorithms before storing it in the database.

K.Explorer’s server nodes do not process the code; they use the associated metadata created during the analysis process, and only transport the code from the storage to the web app / browser.

All communication between the client’s browser and the web app should be encrypted using SSL/TLS.

The code should be decrypted only when it is needed for display to the user. Decryption may occur server-side (Java web app) or client-side (JavaScript in the browser).

Client-side decryption is performed by JavaScript libraries. It can add complexity to the application and may require additional client-side resources.

Server-side decryption is done by Java libraries and requires the encryption keys to be stored on the server.

Client-side encryption can provide stronger security and more control over the encryption and decryption process but can be more complex to implement. Server-side encryption can be easier to implement and maintain but may not provide the same level of security and control as client-side encryption. The choice between client-side and server-side encryption depends on the security requirements defined and the tool configuration used. In a full on-premises configuration, for example, server-side decryption may be the best option.
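The encrypt-before-store / decrypt-on-display flow described above can be sketched as follows. This is a minimal, standard-library-only illustration under stated assumptions: the HMAC-SHA256 keystream stands in for a real cipher, and a production deployment would instead use an authenticated cipher such as AES-GCM from a vetted cryptography library. All function names are hypothetical.

```python
import hashlib
import hmac
import secrets


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream from key + nonce via HMAC-SHA256 in counter mode.

    Illustrative only: a real deployment would use AES-GCM from a
    vetted cryptography library rather than this construction.
    """
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]


def encrypt_snippet(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt a source-code snippet before it is written to KB storage."""
    nonce = secrets.token_bytes(16)          # fresh random nonce per snippet
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))


def decrypt_snippet(key: bytes, blob: bytes) -> bytes:
    """Decrypt a stored snippet only when it must be displayed to the user."""
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))


key = secrets.token_bytes(32)                # held server-side or client-side
stored = encrypt_snippet(key, b"SELECT * FROM invoices;")
assert b"invoices" not in stored             # code is never stored in the clear
print(decrypt_snippet(key, stored).decode())
```

In a server-side configuration, `decrypt_snippet` would run in the web application with the key on the server; in a client-side configuration the equivalent logic would run in the browser, so the plaintext never exists on the server at all.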


The hardware requirements depend on several factors related to the size and complexity of the application to be analysed.

This makes it challenging to be precise about the hardware requirements; the following should therefore be considered the minimum requirements to run the K.Explorer environment. Our architecture is designed to scale as needed.
