jsimIO Automatic

The tool consists of two files containing functions (DataQuery and DataProcess) and one file to be run (DataModel).

Automatic data extraction

The file DataQuery contains 4 functions:

def process_plan():
    ...
    return final_prod,task,task_type,successor,time,machine

def components():
    ...
    return component,ass_task

def buffers():
    ...
    return capacity

def machines():
    ...
    return pred,succ

Each function takes a query as input and returns a set of arrays as output:

  • final_prod: The name of the final product.

  • task: The names of the process steps.

  • task_type: The type (Manufacturing, Handling, Assembly, Quality Control, Loading, Unloading) of each process step.

  • successor: The successor of each process step.

  • time: The service time of each process step.

  • machine: The station where each process step is executed.

  • component: The names of the components.

  • ass_task: The name of the process step on which each component is assembled.

  • capacity: The capacity of the buffers.

  • pred: The names of all the physical elements of the system.

  • succ: The downstream element of each physical element.

The elements of these arrays are returned in no particular order. To develop each function, the steps below are followed (they are described with reference to the function process_plan; an illustrative sketch follows the list):

  1. Import the required libraries.

  2. Write the query as a text file. A SPARQL query has 4 main components:

    • PREFIX clause declares abbreviations (prefixes) for the IRIs used in the query.

    • FROM and FROM NAMED clauses specify the RDF dataset to be addressed.

    • SELECT clause identifies the variables to appear in the results.

    • WHERE clause provides the basic graph pattern to match against the data graph.

  3. Import the text file of the query, convert it to a string, pass the string as the SPARQL query, interrogate the endpoint, and return the results.

  4. Convert the results to arrays.

  5. Convert the values from URI (Uniform Resource Identifier) format to short URIs.

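As an illustration of these steps, a minimal sketch of process_plan is given below. It assumes the endpoint is interrogated with the SPARQLWrapper library and that the query is stored in a text file; the endpoint URL, the file name, and the prefixes and properties in the query are placeholders, not the ones actually used by jsimIO.

from SPARQLWrapper import SPARQLWrapper, JSON

# Example content of ProcessPlan.txt (placeholder prefixes and properties):
#   PREFIX ex: <http://example.org/factory#>
#   SELECT ?final_prod ?task ?task_type ?successor ?time ?machine
#   WHERE { ?task ex:partOf ?final_prod ; ex:hasType ?task_type ;
#           ex:hasSuccessor ?successor ; ex:hasTime ?time ; ex:executedOn ?machine . }

def shorten(uri):
    # Convert a full URI to its short form (the fragment after '#' or the last '/').
    return uri.rsplit("#", 1)[-1].rsplit("/", 1)[-1]

def process_plan(endpoint="http://localhost:3030/plant/sparql"):
    # 2-3) Read the query from the text file and interrogate the endpoint.
    with open("ProcessPlan.txt") as f:
        query = f.read()
    sparql = SPARQLWrapper(endpoint)
    sparql.setQuery(query)
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()
    # 4-5) Convert the results to arrays of short URIs (order is not guaranteed).
    bindings = results["results"]["bindings"]
    cols = ["final_prod", "task", "task_type", "successor", "time", "machine"]
    arrays = {c: [shorten(b[c]["value"]) for b in bindings] for c in cols}
    return (arrays["final_prod"], arrays["task"], arrays["task_type"],
            arrays["successor"], arrays["time"], arrays["machine"])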

Automatic data processing

The file DataProcess contains the function data_processing. It takes the results from DataQuery as input and returns a set of arrays as output:

  • order_machine: The station where each process step is executed.

  • order_task_type: The type (Manufacturing, Handling, Assembly, Quality Control, Loading, Unloading) of each process step.

  • order_time: The service time of each process step.

  • order_capacity: The capacity of the buffers.

  • order_ass: The names of the assembly tasks.

  • part: The names of the parts.

  • part_type: The type (Final, Component) of each part.

The function is developed following the steps below:

  1. Import the required functions and the related input arrays.

  2. Check that exactly one process plan and one final product exist, and verify the correspondence between assembly tasks and components.

  3. Order the input arrays based on the precedence relations in the process plan and create the order_arrays (a sketch of this ordering is given after the list).

  4. Order the physical elements based on their connections and check that they allow the execution of the process plan in terms of the sequence of process steps.

    The above code is inserted as a comment in the scripts because some of the physical connections returned by the query Elements.txt are wrong with respect to the designed ones:

    • Element PP1.PPs_FloatingY instead of PP1

    • Elements PPH1 and PI1 are not successors of any other element

    • PP1 is downstream element of itself

    • RPP1 is downstream element of both T1 and RT1.

      If the code is run, it will return 'Error'.

  5. Define the array of the parts.
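A rough sketch of the ordering in step 3, assuming each task has exactly one successor and the chain is acyclic (the function name and the example values are illustrative, not the actual code):

def order_by_successor(task, successor, values):
    # Map each task to its successor and find the first task,
    # i.e. the one that is not the successor of any other task.
    succ_of = dict(zip(task, successor))
    first = next(t for t in task if t not in successor)
    ordered_tasks = [first]
    # Walk the successor chain while the successor is still a task of the plan.
    while succ_of.get(ordered_tasks[-1]) in succ_of:
        ordered_tasks.append(succ_of[ordered_tasks[-1]])
    # Reorder any task-aligned array (times, machines, types, ...) accordingly.
    index = {t: i for i, t in enumerate(task)}
    return [values[index[t]] for t in ordered_tasks]

# Example: order the service times following the process plan T1 -> T2 -> T3.
task      = ["T2", "T1", "T3"]
successor = ["T3", "T2", "Sink"]
time      = [5.0, 3.0, 7.0]
print(order_by_successor(task, successor, time))   # -> [3.0, 5.0, 7.0]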

Automatic model generation

The file DataModel has to be run by the user. NB: the current version of DataModel runs under these assumptions:

  • Only one final product is produced, so just one process plan exists. The service time of the stations depends only on the task and not on the served class.

  • The configuration of the system is a line without parallel flows.

  • Only the first component can be processed on its own before being assembled; all the other components visit a join node as their first object.

  • If one station performs n tasks, then n stations are created, each one in charge of one task only.

  • If n components are joined within the same task, then (n-1) joins are created in sequence. This set of joins is positioned before the station in charge of the assembly task (see the sketch after this list).

    The final product visits all the joins and, at each one, a component is assembled on it. Finally, the station processes the final product.

  • Join objects are visited by parts just before the related assembly station, because this allows starvation to be properly taken into account.

  • The service and interarrival times are deterministic.
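The two expansion rules above can be summarised with a small illustrative helper (assuming order_ass contains one entry per part joined in an assembly task; names and values are made up):

from collections import Counter

def expansion_counts(order_machine, order_ass):
    # One station per (machine, task) pair: a machine performing n tasks becomes n stations.
    n_stations = len(order_machine)
    # An assembly task joining n parts is preceded by (n - 1) joins in sequence.
    parts_per_task = Counter(order_ass)
    n_joins = sum(n - 1 for n in parts_per_task.values())
    return n_stations, n_joins

# Example: machine M1 performs two tasks and assembly task A1 joins three parts.
print(expansion_counts(["M1", "M1", "M2"], ["A1", "A1", "A1"]))   # -> (3, 2)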

To develop the code, the steps below are followed (only the second step requires the user to insert input data manually):

  1. Import the required libraries, functions and the related input arrays.

  2. Set the simulation options (optional, only if flag=1) and the interarrival time of the parts (mandatory).

  3. Verify whether the process is a manufacturing or an assembly one. The modelling of the two cases is carried out with two separate procedures.

  4. Create the Model instance; if flag=1, consider the options.

  5. Model the system.

The modelling procedure consists in the automatic deployment of the procedures explained in manufacturing_process and assembly_process; a sketch of the dispatch between the two is given below.
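A minimal sketch of this dispatch, assuming that the presence of at least one Assembly step selects the assembly procedure (the two procedure bodies stand for the sub-steps detailed below):

def manufacturing_process(data):
    ...   # sub-steps 5.1.1 - 5.1.4

def assembly_process(data):
    ...   # sub-steps 5.2.1 - 5.2.6

def model_system(order_task_type, data):
    # An assembly process plan contains at least one Assembly task.
    if "Assembly" in order_task_type:
        return assembly_process(data)
    return manufacturing_process(data)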

Manufacturing process procedure

5.1.1) Create the network nodes and set their capacity.

5.1.2) Define the customer class.

5.1.3) Set the service times.

5.1.4) Define the routings.
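A plain-Python sketch of the data these sub-steps produce for a simple three-station line (the structures and values are purely illustrative and do not correspond to the actual jsimIO objects or calls):

# Illustrative inputs for a three-step manufacturing plan.
final_prod        = "ProductA"
order_machine     = ["M1", "M2", "M3"]
order_time        = [10.0, 12.5, 8.0]
order_capacity    = [1, 1, 1]
interarrival_time = 15.0

stations      = [f"{m}_{i}" for i, m in enumerate(order_machine)]        # 5.1.1 one node per task
capacities    = dict(zip(stations, order_capacity))                       # 5.1.1 node capacities
product_class = {"name": final_prod, "interarrival": interarrival_time}   # 5.1.2 customer class
service_times = dict(zip(stations, order_time))                           # 5.1.3 deterministic times
routing       = ["Source", *stations, "Sink"]                             # 5.1.4 serial routing
print(" -> ".join(routing))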

Assembly process procedure

5.2.1) Create an array of component1,...,component(len(products)).

5.2.2) Define Source and Sink.

5.2.3) Define the final product class and the related fork, the component classes and the related joins.

5.2.4) Define the machines and set their capacity.

5.2.5) Set the service times. All the Manufacturing tasks before the first assembly one are executed on the first component (created ad hoc as 'final product_Component' in the function data_processing). The successive tasks are executed on final (the final product).

5.2.6) Define the routings. Until the first assembly task, the first component is forked from final and processed by all the required machines. If the first station executes a manufacturing task, then the fork for the first component is connected to a machine, otherwise it is connected to the first join.

All the other components are forked from final and their routing includes only one join each because, after the first join object is visited, all the machines process final. Check whether the routing is defined up to the last station: if so, the last station is connected to the sink, otherwise continue.
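A rough sketch of the routings produced by 5.2.6 for a plan with a single assembly step (purely illustrative: the class names, node names and join naming are assumptions):

# Illustrative plan: two manufacturing steps, one assembly step, one final step.
order_task_type = ["Manufacturing", "Manufacturing", "Assembly", "Manufacturing"]
order_machine   = ["M1", "M2", "ASM", "M3"]
final_prod      = "ProductA"
components      = ["ProductA_Component", "Comp2"]   # the first component stands for the product itself
assembly_of     = {"Comp2": 2}                      # component -> index of its assembly task

stations       = [f"{m}_{i}" for i, m in enumerate(order_machine)]
first_assembly = order_task_type.index("Assembly")

routing = {}
# First component: forked from final, processed by every machine before the first
# assembly task, then sent to the join placed before the assembly station.
routing[components[0]] = ["Fork", *stations[:first_assembly], f"Join_{first_assembly}"]
# Final product: visits the join and, from the first assembly task on, all remaining stations.
routing[final_prod] = ["Source", "Fork", f"Join_{first_assembly}", *stations[first_assembly:], "Sink"]
# Every other component is forked from final and visits only its own join.
for comp, idx in assembly_of.items():
    routing[comp] = ["Fork", f"Join_{idx}"]

for cls, path in routing.items():
    print(cls, ":", " -> ".join(path))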

  6. Run the simulation.

Results

A folder named "name_of_model"_"date"_"time" is created.

It contains a .jsimg input file for the simulation in JSIM, a .jsim file with the results of the simulation, and a .csv file containing the log of the simulation.

Output

The LOG table is created as shown below. The columns indicate, in order:

  • Loggername: the name of the station

  • Timestamp

  • Job_ID: the number of the job

  • Class_ID: the name of the part type.

LOG

Among the evaluated performance measures, the throughput can be visualised as shown below:

Throughput
