Merge pull request #7 from apdn7/feature/4.2.0
Feature/4.2.0
apdn7 authored Apr 25, 2023
2 parents 93508b7 + 1bf8d0e commit dc4ad90
Showing 186 changed files with 6,997 additions and 2,258 deletions.
7 changes: 7 additions & 0 deletions LICENSE.md
Original file line number Diff line number Diff line change
@@ -711,6 +711,13 @@ or FITNESS FOR A PARTICULAR PURPOSE. See the license files for details.

> licensed under the <a href="http://opensource.org/licenses/MIT">MIT License</a>
### js/shepherd.min.js

* Copyright (c) 2021
* Shepherd is maintained by Ship Shape

> licensed under the <a href="http://opensource.org/licenses/MIT">MIT License</a>
### sigmajs/build/sigma.min.js
---------------------

4 changes: 3 additions & 1 deletion README.md
@@ -159,7 +159,9 @@ Data is stored as TSV files under the subdirectories in [/sample_data](/sample_da
* 20220311.tsv
* /COG_sample_data: Occurrence of machine alarms
* alarm_every_15minutes.tsv
* alarm_daily
* alarm_daily.tsv
* /AgP_sample_data: PartNo, Ok/NG, NG_Mode, and pressure data
* AgP_sample.tsv

The above data will be imported automatically after activation.
You can call each sample visualization from 'Load' or 'Bookmark' in the upper right corner of the GUI.
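The AgP sample data described above (PartNo, Ok/NG, and pressure columns) lends itself to time-bucketed aggregation. A minimal stdlib sketch of that idea, with hypothetical rows and column layout (not the actual contents of AgP_sample.tsv):

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical AgP-style rows: (timestamp, Ok/NG judgement, pressure).
# Values are illustrative, not taken from the real sample file.
rows = [
    ('2023-04-01 10:05', 'OK', 1.2),
    ('2023-04-01 10:40', 'NG', 1.8),
    ('2023-04-01 11:15', 'OK', 1.1),
    ('2023-04-01 11:45', 'OK', 1.3),
]

counts = defaultdict(lambda: {'OK': 0, 'NG': 0})  # stacked-bar input
pressure = defaultdict(list)                      # line-chart input

for ts, judge, p in rows:
    # Bucket each record into its hour.
    hour = datetime.strptime(ts, '%Y-%m-%d %H:%M').replace(minute=0)
    counts[hour][judge] += 1
    pressure[hour].append(p)

# Per-hour OK/NG counts feed a stacked bar chart; mean pressure feeds a line chart.
for hour in sorted(counts):
    mean_p = sum(pressure[hour]) / len(pressure[hour])
    print(hour, counts[hour], round(mean_p, 2))
```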
20 changes: 20 additions & 0 deletions RELEASE.md
@@ -1,5 +1,25 @@
# Releases

## v4.2.0

New features and improvements

* Added "Aggregation Plot(AgP)" page.
* This includes stacked bar charts and line charts with aggregated data.
* For example, you can visualize stacked bar charts of production volume and the number of defects alongside a line chart of aggregated sensor data, and explore their relationships.
* Added sample data (/sample_data/AgP_sample_data) and a sample bookmark (10-1 AgP).
* (StP/RLP/ScP) Enabled the data finder.
* (FPP/StP/MSP) Changed the sampling logic of kernel density estimation, which is activated when the number of data points is large.
* Changed from random sampling to an equidistant sampling method to preserve the minimum, maximum, and median values.
* (PCA) In the T2/Q contribution plot, longer item names are now displayed on the bar graphs, and the appearance has been adjusted.
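The equidistant sampling mentioned above can be sketched as follows. This is an assumed implementation for illustration only; the function name and actual code in the platform differ:

```python
def equidistant_sample(values, n_samples):
    """Rank-based equidistant sampling that keeps min, median, and max.

    Unlike random sampling, picking points at equal rank spacing from the
    sorted data always includes both extremes, and (for odd sample counts)
    the median, so the kernel density estimate's support is not truncated.
    """
    data = sorted(values)
    n = len(data)
    if n <= n_samples:
        return data
    # Indices at equal spacing from the first to the last element.
    step = (n - 1) / (n_samples - 1)
    return [data[round(i * step)] for i in range(n_samples)]

print(equidistant_sample(range(101), 5))  # [0, 25, 50, 75, 100]
```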

Bugfixes

* Fixed a bug where importing data from a CSV/TSV file containing columns without names raised an error and the data could not be read (such columns are now skipped)
* Fixed an incorrect week number displayed in the calendar picker and the data finder
* (RLP) Fixed a bug where long variable names overlapped.
* (SkD/PCA) Fixed a bug where the original "column name" registered in the data source was displayed instead of the "display name".

## v4.1.2

New features and improvements
2 changes: 1 addition & 1 deletion VERSION
@@ -1,4 +1,4 @@
v4.1.2.178.79921487
v4.2.0.181.3f08083a
1
OSS

25 changes: 13 additions & 12 deletions about/Endroll.md
@@ -58,9 +58,9 @@
- [Early Bird Collaborators](#early-bird-collaborators)
- [Research & Development Team](#research--development-team)
- [Data Analysis Education, Research & Promotion](#data-analysis-education-research--promotion)
- [Data Analysis Package](#data-analysis-package)
- [Data Analysis Interface](#data-analysis-interface)
- [Data Analysis Platform](#data-analysis-platform)
- [Analysis Package](#analysis-package)
- [Analysis Interface](#analysis-interface)
- [Analysis Platform](#analysis-platform)
- [© F-IoT DAC Team & Rainbow7 + Bridge7](#-f-iot-dac-team--rainbow7--bridge7)

<!-- /TOC -->
@@ -142,15 +142,15 @@
|Data Analysis Education Development & Management||Takero Arakawa 荒川 毅郎 DNJP Monozukuri DX Promotion Div.|
<br>

## Data Analysis Package
## Analysis Package

||||
|--:|:-:|:--|
|Data Analysis Package Development & Management Leader||Genta Kikuchi 菊池 元太 DNJP Monozukuri DX Promotion Div.|
|Data Analysis Package Development||Sho Takahashi 髙橋 翔 DNJP Monozukuri DX Promotion Div.|
|Analysis Package Development & Management Leader||Genta Kikuchi 菊池 元太 DNJP Monozukuri DX Promotion Div.|
|Analysis Package Development||Sho Takahashi 髙橋 翔 DNJP Monozukuri DX Promotion Div.|
<br>

## Data Analysis Interface
## Analysis Interface

||||
|--:|:-:|:--|
@@ -159,21 +159,22 @@
|||Pham Minh Hoang ファム ミン ホアン Phạm Minh Hoàng FPT Software Japan|
|||Tran Thi Kim Tuyen チャン ティ キム トゥエン Trần Thị Kim Tuyền FPT Software Japan|
|||Nguyen Huu Tuan グエン フー トゥアン Nguyễn Hữu Tuấn FPT Software Japan|
|||Duong Quoc Khanh ズオン クォック カイン Dương Quốc Khánh FPT Software Japan|
|||Nguyen Van Hoai グエン ヴァン ホアイ Nguyễn Văn Hoài|
|Technology Leader of Rainbow7||Masato Yasuda 安田 真人 DNJP Monozukuri DX Promotion Div.|
|Agile Master of Rainbow7 & Bridge7||Yasutomo Kawashima 川島 恭朋 DNJP Monozukuri DX Promotion Div.|
<br>

## Data Analysis Platform
## Analysis Platform

||||
|--:|:-:|:--|
|Data Analysis Platform Product Owner FY20-||Tatsunori Kojo 古城 達則 DNJP Monozukuri DX Promotion Div.|
|Technology Leader of Data Analysis & Data Analysis Platform Product Owner FY19||Genta Kikuchi 菊池 元太 DNJP Monozukuri DX Promotion Div.|
|Data Analysis Platform Development||Takero Arakawa 荒川 毅郎 DNJP Monozukuri DX Promotion Div.|
|Analysis Platform Product Owner FY20-||Tatsunori Kojo 古城 達則 DNJP Monozukuri DX Promotion Div.|
|Technology Leader of Data Analysis & Analysis Platform Product Owner FY19||Genta Kikuchi 菊池 元太 DNJP Monozukuri DX Promotion Div.|
|Analysis Platform Development||Takero Arakawa 荒川 毅郎 DNJP Monozukuri DX Promotion Div.|
|||Sho Takahashi 髙橋 翔 DNJP Monozukuri DX Promotion Div.|
|Supervisor & Technology Leader of Data Analysis & SQC||Mutsumi Yoshino 吉野 睦 DNJP Monozukuri DX Promotion Div.|
|Supervisor & Senior Manager||Toshikuni Shinohara 篠原 壽邦 DNJP Monozukuri DX Promotion Div.|
|Supervisor & Senior Manager||Toshikuni Shinohara 篠原 壽邦|
<br>
<br>
<br>
7 changes: 6 additions & 1 deletion ap/__init__.py
@@ -21,7 +21,7 @@
from ap.common.constants import FlaskGKey, SQLITE_CONFIG_DIR, PARTITION_NUMBER, UNIVERSAL_DB_FILE, APP_DB_FILE, \
TESTING, YAML_CONFIG_VERSION, YAML_CONFIG_BASIC, YAML_CONFIG_DB, YAML_CONFIG_PROC, YAML_CONFIG_AP, \
INIT_APP_DB_FILE, INIT_BASIC_CFG_FILE, REQUEST_THREAD_ID, YAML_START_UP, LOG_LEVEL, AP_LOG_LEVEL, AppGroup, \
AppSource
AppSource, appENV
from ap.common.logger import log_execution
from ap.common.services.request_time_out_handler import RequestTimeOutAPI, set_request_g_dict
from ap.common.trace_data_log import get_log_attr, TraceErrKey
@@ -118,6 +118,7 @@ def create_app(object_name=None):
from .sankey_plot import create_module as sankey_create_module
from .co_occurrence import create_module as co_occurrence_create_module
from .multiple_scatter_plot import create_module as multiple_scatter_create_module
from .aggregate_plot import create_module as agp_create_module
from .common.logger import bind_user_info
from .script.convert_user_setting import convert_user_setting_url
from .script.migrate_csv_datatype import migrate_csv_datatype
@@ -202,6 +203,7 @@ def create_app(object_name=None):
co_occurrence_create_module(app)
multiple_scatter_create_module(app)
tile_interface_create_module(app)
agp_create_module(app)
app.add_url_rule('/', endpoint='tile_interface.tile_interface')

basic_config_yaml = BasicConfigYaml(dic_yaml_config_file[YAML_CONFIG_BASIC])
@@ -272,6 +274,8 @@ def before_request_callback():
# get the last time user request
global dic_request_info



# get api request thread id
thread_id = request.form.get(REQUEST_THREAD_ID, None)
set_request_g_dict(thread_id)
@@ -294,6 +298,7 @@ def before_request_callback():
"url": "https://www.google.com/chrome/"
})


@app.after_request
def after_request_callback(response: Response):
if 'event-stream' in str(request.accept_mimetypes):
4 changes: 4 additions & 0 deletions ap/aggregate_plot/__init__.py
@@ -0,0 +1,4 @@

def create_module(app, **kwargs):
    from .controllers import agp_blueprint
    app.register_blueprint(agp_blueprint)
19 changes: 19 additions & 0 deletions ap/aggregate_plot/controllers.py
@@ -0,0 +1,19 @@
import os

from flask import Blueprint, render_template

from ap.common.services.form_env import get_common_config_data

agp_blueprint = Blueprint(
    'agp',
    __name__,
    template_folder=os.path.join('..', 'templates', 'aggregate_plot'),
    static_folder=os.path.join('..', 'static', 'aggregate_plot'),
    url_prefix='/ap'
)


@agp_blueprint.route('/agp')
def index():
    output_dict = get_common_config_data()
    return render_template("aggregate_plot.html", **output_dict)
Empty file.
Empty file.
2 changes: 2 additions & 0 deletions ap/api/__init__.py
@@ -12,6 +12,7 @@ def create_module(app, **kwargs):
from .heatmap.controllers import api_heatmap_blueprint
from .parallel_plot.controllers import api_paracords_blueprint
from .common.controlllers import api_common_blueprint
from .aggregate_plot.controllers import api_agp_blueprint
app.register_blueprint(api_setting_module_blueprint)
app.register_blueprint(api_trace_data_blueprint)
app.register_blueprint(api_table_viewer_blueprint)
@@ -25,3 +26,4 @@ def create_module(app, **kwargs):
app.register_blueprint(api_heatmap_blueprint)
app.register_blueprint(api_paracords_blueprint)
app.register_blueprint(api_common_blueprint)
app.register_blueprint(api_agp_blueprint)
Empty file.
96 changes: 96 additions & 0 deletions ap/api/aggregate_plot/controllers.py
@@ -0,0 +1,96 @@
import timeit
from copy import deepcopy

import pytz
import simplejson
from dateutil import tz
from flask import Blueprint, request

from ap.api.aggregate_plot.services import gen_agp_data
from ap.api.categorical_plot.services import customize_dict_param
from ap.api.trace_data.services.csv_export import to_csv
from ap.common.constants import COMMON, ARRAY_FORMVAL, END_PROC, CLIENT_TIMEZONE
from ap.common.services import http_content
from ap.common.services.csv_content import zip_file_to_response
from ap.common.services.form_env import parse_multi_filter_into_one, get_end_procs_param, \
    update_data_from_multiple_dic_params, parse_request_params
from ap.common.services.import_export_config_n_data import get_dic_form_from_debug_info, \
    set_export_dataset_id_to_dic_param
from ap.common.trace_data_log import save_input_data_to_file, EventType, save_draw_graph_trace, trace_log_params

api_agp_blueprint = Blueprint(
    'api_agp',
    __name__,
    url_prefix='/ap/api/agp'
)


@api_agp_blueprint.route('/plot', methods=['POST'])
def generate_agp():
    """Generate Aggregation Plot (AgP) data from the posted form.

    Returns:
        str -- JSON-serialized AgP plot data.
    """
    start = timeit.default_timer()
    dic_form = request.form.to_dict(flat=False)
    # save dic_form to pickle (for future debug)
    save_input_data_to_file(dic_form, EventType.AGP)

    dic_param = parse_multi_filter_into_one(dic_form)

    # check if we run debug mode (import mode)
    dic_param = get_dic_form_from_debug_info(dic_param)

    customize_dict_param(dic_param)
    org_dic_param = deepcopy(dic_param)
    dic_params = get_end_procs_param(dic_param)

    for single_dic_param in dic_params:
        agp_data, *_ = gen_agp_data(single_dic_param)
        org_dic_param = update_data_from_multiple_dic_params(org_dic_param, agp_data)

    stop = timeit.default_timer()
    org_dic_param['backend_time'] = stop - start

    # export mode ( output for export mode )
    set_export_dataset_id_to_dic_param(org_dic_param)

    org_dic_param['dataset_id'] = save_draw_graph_trace(vals=trace_log_params(EventType.AGP))

    return simplejson.dumps(org_dic_param, ensure_ascii=False, default=http_content.json_serial, ignore_nan=True)


@api_agp_blueprint.route('/data_export/<export_type>', methods=['GET'])
def data_export(export_type):
    """Export AgP data as CSV/TSV.

    Returns:
        Response -- a zip archive with one file per end process.
    """
    dic_form = parse_request_params(request)
    dic_param = parse_multi_filter_into_one(dic_form)

    # check if we run debug mode (import mode)
    dic_param = get_dic_form_from_debug_info(dic_param)

    customize_dict_param(dic_param)
    dic_params = get_end_procs_param(dic_param)

    delimiter = ',' if export_type == 'csv' else '\t'

    agp_dataset = []
    csv_list_name = []
    for single_dic_param in dic_params:
        agp_dat, agp_df, graph_param, dic_proc_cfgs = gen_agp_data(single_dic_param)
        end_proc_id = int(agp_dat[ARRAY_FORMVAL][0][END_PROC])
        proc_name = dic_proc_cfgs[end_proc_id].name
        csv_list_name.append('{}.{}'.format(proc_name, export_type))

        client_timezone = agp_dat[COMMON].get(CLIENT_TIMEZONE)
        client_timezone = pytz.timezone(client_timezone) if client_timezone else tz.tzlocal()
        csv_df = to_csv(agp_df, dic_proc_cfgs, graph_param, client_timezone=client_timezone, delimiter=delimiter)
        agp_dataset.append(csv_df)

    response = zip_file_to_response(agp_dataset, csv_list_name)
    response.charset = "utf-8-sig"
    return response
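The export endpoint above bundles one CSV per end process into a single zip archive via `zip_file_to_response`. A stdlib-only sketch of just that packaging step (file names and contents here are hypothetical, and the real helper also builds the HTTP response):

```python
import io
import zipfile

# Hypothetical per-process CSV payloads, mirroring csv_list_name/agp_dataset.
csv_list_name = ['ProcessA.csv', 'ProcessB.csv']
agp_dataset = ['time,pressure\n10:00,1.2\n', 'time,pressure\n10:00,0.9\n']

# Write each CSV into an in-memory zip archive.
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zf:
    for name, content in zip(csv_list_name, agp_dataset):
        zf.writestr(name, content)

# The archive bytes would become the HTTP response body.
archive = zipfile.ZipFile(io.BytesIO(buf.getvalue()))
print(archive.namelist())  # ['ProcessA.csv', 'ProcessB.csv']
```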