JRSSEM 2022, Vol. 02 No. 4, 617–631
E-ISSN: 2807 - 6311, P-ISSN: 2807 - 6494
DOI: 10.36418/jrssem.v2i04.328 https://jrssem.publikasiindonesia.id/index.php/jrssem
DEVELOPMENT OF BUSINESS INTELLIGENCE FOR
INTEGRATION OF FINANCIAL REPORTING SYSTEM IN PT
XYZ
Dea Carissa
Bina Nusantara University Jakarta, Indonesia
*e-mail: deacarissa@gmail.com
*Correspondence: deacarissa@gmail.com
Submitted: 09th November 2022; Revised: 19th November 2022; Accepted: 28th November 2022
Abstract: This research aims to determine the relationship between two or more variables. Data were collected through a literature study and field research using questionnaires. The research found that the more influential variable at BAPPEDA of the DKI Jakarta Provincial Government is Business Intelligence, so its implementation in the field needs to be prioritized and given special attention. The research method used is associative research, covering the design of the Data Warehouse architecture, planning of the Data Warehouse sources, dimensional data modeling, and population of the Data Warehouse. The results of this study can also be used as a reference for further follow-up with the DKI Jakarta Provincial Government BAPPEDA, to be prioritized in the near future. The study concludes that a transaction data warehouse designed using PDI (Pentaho Data Integration) is very helpful in collecting transaction data from the original transaction source, which already has a database but serves only as OLTP (Online Transactional Processing) data, so that it can be turned into OLAP (Online Analytical Processing) data and analyzed using OLAP.
Keywords: Business Intelligence; Financial Reporting System; PT XYZ.
INTRODUCTION
Business Intelligence is a new technology for understanding the past and predicting the future. The technology intended here is one capable of collecting, storing, accessing, and analyzing data to help decision makers make better decisions. Business Intelligence is a data-driven decision support system. The advantages of applying Business Intelligence lie in its ability to collect, store, analyze, and provide access to data, helping users make accurate decisions through activities that include decision support systems, querying, reporting, OnLine Analytical Processing (OLAP), statistical analysis, forecasting, and data mining. Figure 1 below shows that the idea of Business Intelligence has evolved over the past forty years and will continue to do so. It can be seen that Business Intelligence focuses on data mining and knowledge discovery, which are important aspects of Business Intelligence (Siswono 2013).
Figure 1. Development of BI Technology and its applications (Chang et al., 2006)
At the beginning of 2019, PT XYZ, a subsidiary of the XXX group, was introduced as a technology-based express delivery company ready to be the first choice for consumers in meeting their shipping needs. Recognizing its presence in the digital era, PT XYZ applies the latest technological developments in every service it offers. Moreover, determined to be a trusted and reliable partner connecting the entire Indonesian market, PT XYZ implements an integrated logistics system supported by integrated transportation infrastructure.
To support the company's business processes, PT XYZ uses NETSUITE as an Enterprise Resource Planning (ERP) application to record company costs and revenues, and an E-bill application to manage corporate revenue-related transactions. At PT XYZ, processing transaction data and visualizing it into reports is currently still done manually using Microsoft Excel. The data entered into the ERP is only in the form of transaction data reports containing all company transactions in a certain period. PT XYZ requires transaction data reports to assist in planning and controlling costs, determining the cost of goods shipped, and making management decisions. Based on interviews with the management of PT XYZ, there are several problems related to this process, including:
1. The data obtained from the output of the E-bill application is still raw data, and it takes 1-2 weeks to process it into information because this is done manually.
2. The technology used is still limited to Microsoft Office as a data processing tool, while the company has many large data sources.
3. The speed and accuracy of data processing have not been maximized because data is not yet fully integrated between departments.
4. Decision making is less accurate and less timely because it is not yet supported by a tool that supports decision making.
PT XYZ needs a BI application that provides
monitoring information related to total
order reports and revenue estimates to
improve management effectiveness and
efficiency in strategic decision making.
Table 1. Processes Currently Running at PT XYZ

Steps | Time Required | Constraints
Downloading transaction data from E-Bill | 3-5 days | The more customers and transactions there are, the longer it takes to download the data
Unifying transaction data | 3-5 days | The more files downloaded, the more time it takes to unify all files into one large file
Processing transaction data using Microsoft Excel formulas | 2-3 days | Files sometimes stop responding or even force-close by themselves
Uploading processed transaction data into NETSUITE | 1-2 days | Data is uploaded one file at a time
Presenting processed transaction data in the form of graphs | 1-2 days | Files sometimes stop responding or even force-close by themselves
Figure 2. Processes Currently Running at PT XYZ
Therefore, developing a BI application can be a solution to assist the company in monitoring business processes, especially those related to the company's financial performance. Development includes the design and implementation of BI solutions ranging from the architecture, the data warehouse, and the ETL processes to visualization in the form of dashboards. BI is used as the solution because it can be applied by many organizations to compile ERP information and other data repositories for quick and effective decision making (Lufty Abdillah 2020), and because the purpose of BI is to provide managers with up-to-date information about the business, allowing them to make decisions that solve a problem or seize an opportunity.
The methodology used in this study follows the BI roadmap approach (Moss and Atre 2003). The BI roadmap approach is used because it is agile and adaptive to changes while building the BI product and is fully aimed at supporting BI development. BI technology includes a data integration process, Extract, Transform, Load (ETL), which helps convert data from an Online Transaction Processing (OLTP) system into a form suitable for Online Analytical Processing (OLAP).
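To illustrate the OLTP-to-OLAP conversion that ETL performs, the following is a minimal sketch in Python, using SQLite in place of the production databases; the table and column names are illustrative assumptions, not PT XYZ's actual schema.

```python
# Minimal ETL sketch: extract from an OLTP table, apply a simple transform,
# and load a fact table for analysis. SQLite stands in for the real source
# and warehouse databases; all names are illustrative assumptions.
import sqlite3

oltp = sqlite3.connect(":memory:")   # hypothetical transactional source
dwh = sqlite3.connect(":memory:")    # hypothetical data warehouse

# Seed a tiny OLTP shipments table so the sketch is self-contained.
oltp.execute(
    "CREATE TABLE shipments (order_id TEXT, order_date TEXT, "
    "customer_id TEXT, amount REAL)"
)
oltp.executemany(
    "INSERT INTO shipments VALUES (?, ?, ?, ?)",
    [("ORD-1", "2022-11-01", "C001", 150000.0),
     ("ORD-2", "2022-11-02", "C002", None)],
)

# Extract: pull the raw transactions recorded by the OLTP system.
rows = oltp.execute(
    "SELECT order_id, order_date, customer_id, amount FROM shipments"
).fetchall()

# Transform: drop rows with missing amounts and derive a YYYYMMDD date key.
clean = [
    (order_id, int(order_date.replace("-", "")), customer_id, float(amount))
    for order_id, order_date, customer_id, amount in rows
    if amount is not None
]

# Load: insert into the fact table that OLAP queries will read.
dwh.execute(
    "CREATE TABLE fact_transaction (order_id TEXT, date_key INTEGER, "
    "customer_id TEXT, amount REAL)"
)
dwh.executemany("INSERT INTO fact_transaction VALUES (?, ?, ?, ?)", clean)
dwh.commit()
```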
MATERIALS AND METHODS
The research method used is associative research, which is research that aims to find out the relationship between two or more variables. Data were collected through a literature study and field research using questionnaires. The research found that the more influential variable at BAPPEDA of the DKI Jakarta Provincial Government is Business Intelligence, so its implementation in the field needs to be prioritized and given special attention. The results of this study can also be used as a reference for further follow-up, to be prioritized by the DKI Jakarta Provincial Government BAPPEDA in the near future. The research methods used include designing a Data Warehouse architecture, planning the Data Warehouse sources, dimensional data modeling, and populating the Data Warehouse. This research concluded that the Data Warehouse model developed at the Bina Sarana Informatika Academy was designed according to the needs for supporting strategic decision making. The Data Warehouse model that is built can provide strategic information that supports academic evaluation and planning; with the help of Business Intelligence, Data Warehouse information can be presented in various dimensions according to needs.
RESULTS AND DISCUSSION
Identification of Data Marts
The second process is data mart identification. Data marts are a subset of the Data Warehouse, usually consisting of a single subject area. The data can then be analyzed and grouped by data mart, according to the users and their reporting needs.
Table 2. Identification of Data Marts

Data Mart | Characteristics | Source Data Mart
Data Mart 1 | Transactional data | ps_shipment_ar
Data Mart 2 | Invoicing data | tx_payment_invoice
Data Mart 3 | Settlement transactions from external parties | tx_payment_fsi
Data Mart 4 | Reconciliation of internal and external settlement data | ps_collection
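As an illustration of how one of these data marts might be populated from its source table, the following is a hedged sketch: only the table names above come from the paper, while the column names on ps_shipment_ar are assumptions, and SQLite stands in for the actual database engine.

```python
# Hypothetical population of Data Mart 1 (transactional data) from its
# source table ps_shipment_ar; column names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    -- Source table name comes from Table 2; its columns are assumed.
    CREATE TABLE ps_shipment_ar (
        shipment_id TEXT, shipment_date TEXT, customer_id TEXT, amount REAL);

    -- Data Mart 1 keeps only the transactional subject area.
    CREATE TABLE dm1_transactional AS
    SELECT shipment_id, shipment_date, customer_id, amount
    FROM ps_shipment_ar;
    """
)
conn.commit()
```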
Non-Functional Requirement
The fourth process is the Non-Functional Requirements, which concern the security, performance, availability, and maintenance of the system that has been created. The basis of security includes user group creation, access rights settings, and an audit trail. For performance, recalculation of the related reports after a manual adjustment must be completed in under 10 minutes per report. For availability, the system must process BI data on a schedule, and BI data must be available and accessible to users no later than 08:00 am. For maintenance, BI data backups need to be performed periodically and properly.
User Requirement
The fifth process is the User Requirements: the BI reporting application must have access control and user roles. The user roles allow users to manage all functions in the application, manage reports (such as retrieving the required data and performing create, update, and delete operations), check the results of reports made by maker users, and send reports to management. The Harmony reporting application must therefore have access controls and user roles, divided into three parts, namely:
Table 3. User Requirements

Action | Admin | Maker | Approver
Create User Group | X | |
Create User | X | |
Create Parameter | X | X |
Parameter Approver | X | | X
Create Data | X | X |
Adjustment Data | X | X |
Report Approver | X | | X
Send Report | X | | X
The functions of the user roles are:
1. Admin: can manage all functions in the application.
2. Maker: can manage reports, such as retrieving the required data and performing create, update, and delete operations.
3. Approver: can check the results of reports made by maker users and send reports to management.
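A minimal sketch of how these role-based permissions could be enforced in application code is shown below; the role names follow the list above, while the concrete action names are illustrative assumptions.

```python
# Role-based access control sketch for the three user roles described above;
# the concrete action names are illustrative assumptions.
ROLE_ACTIONS = {
    "Admin": {"*"},  # Admin can manage all functions in the application
    "Maker": {"create_report", "update_report", "delete_report", "retrieve_data"},
    "Approver": {"check_report", "send_report"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the role may perform the action ('*' grants everything)."""
    allowed = ROLE_ACTIONS.get(role, set())
    return "*" in allowed or action in allowed

# Example: a Maker can create reports but cannot send them to management.
assert is_allowed("Maker", "create_report")
assert not is_allowed("Maker", "send_report")
assert is_allowed("Admin", "send_report")
```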
System Requirement
The sixth process is the System Requirements, a specification that defines the functions the information system to be built must have, such as infrastructure specifications. The system used is a cloud database, with the following details.
Figure 3. System Requirements
System Design
The seventh process is System Design. This stage of system development defines the functional needs, prepares the implementation design, and describes how the system is formed, in the form of drawings, plans, and sketches, or the arrangement of several separate elements into a functioning whole, including configuring the software and hardware components of the system. Five stages are carried out based on the system design drawings, namely:
1. Data Source
At this stage, the data sources used are a database on PostgreSQL Server, the PT XYZ database (SQL Server), and user Excel files.
2. Data Preparation
At this stage, the data to be pulled from the sources is prepared and then loaded into the DWH; this is part of the ETL (Extract, Transform, Load) process. The BI tool used at this stage is SSIS.
3. Data Storage
At this stage, a Data Warehouse stored in a cloud database is prepared and filled with data extracted from the data sources and mapped according to the results of the analysis.
4. Data Analysis
This is a very important stage, because here the researchers build a cube which can later be used for analysis, calculation, and linking of data. The BI tool used at this stage is SSAS (SQL Server Analysis Services).
5. Data Access
This is the finalization stage, where the data that has been processed from stage 1 to stage 4 is presented in the form of reports and applications. The tools used are SSMS (SQL Server Management Studio) and the Harmony application.
Development of a Prototype
A prototype is one of the system life cycle methods based on the concept of a working model. The goal is to develop the model into a final system. The stages of the prototyping method include:
1. Needs analysis
At this stage, the developer identifies the software and all the requirements of the system to be created.
2. Build prototyping
Prototyping is built by creating temporary designs that focus on what will be presented to the user (for example, input and output formats).
Figure 4. Prototype Intelligence Dashboard
The Intelligence dashboard consists of a bar chart and a table; the bar chart depicts the figures from the table above it.
3. Evaluation of prototyping
This evaluation is carried out to find out whether the prototype is in accordance with user expectations.
4. Encoding the system
At this stage, the approved prototype is converted into a programming language.
5. Testing the system
At this stage, the software system is tested, and SIT (System Integration Test) documentation is created.
6. System evaluation
The finished software is evaluated by the user to find out whether the system works as expected, and UAT (User Acceptance Test) documentation is created.
7. Using the system
Software that has been tested and approved by the customer is ready to use.
Implementation and Control
a. Development of Data Warehouses and Data Marts
After the data marts are formed, the process of developing the data warehouse according to the needs of the data marts is determined. Here the dimensions and the related facts that form the data warehouse are defined in accordance with the data marts.
1. Dimensional Modeling
Dimensional modeling is the process of forming dimensions, facts, and the star schema, where the star schema is formed in accordance with the Vercellis method. At this stage, the dimensions related to the facts are determined and adjusted to the needs of the data marts according to business requirements, so that there is no data redundancy in the dimensions (a minimal sketch of such a star schema follows at the end of this subsection).
Figure 5. Entity Relationship Diagram
2. Physical Design
Physical design consists of two categories, the physical design of the dimensions and of the facts, which is the process of forming metadata in the database by detailing all the attributes of each dimension and fact designed in the previous stage. It also determines the database used for integrating the data of the related dimensions and facts: the data is adjusted to the target database by pulling it into staging, and after staging, the data is analyzed into dimensions and facts. After the fact and dimension analysis for the database is carried out, the queries are built during Data Warehouse processing in the SQL Server Integration Services application, and queries are formed for each of the Staging and Data Warehouse packages according to need.
3. System Architecture
The system architecture emphasized at this stage is a conceptual model that defines the structure, behavior, and views of the system. A system architecture may consist of system components, the externally visible properties of those components, and the relationships (e.g. behavior) between them. It provides a plan from which the product can be obtained and a system can be developed whose parts work together to implement the system as a whole. In this system architecture there are four stages that will run and be integrated with each other. The first is the source system stage, consisting of several parts including DAD and the EXCEL parameters. In the second stage the data sources are entered into the database. The third stage is the Extract, Transform, Load (ETL) process, and the fourth stage is the formation of the data warehouse.
The architecture for forming the data warehouse takes data from several sources in order to build the required data warehouse, and can be described as follows:
1. In the first stage, the data sources used come from several systems, namely:
   a. Order Management System (MySQL Server),
   b. SGS (MySQL Server),
   c. FVP (MySQL Server),
   d. EXCEL parameters (Excel): data from the payment gateway,
   e. EXCEL parameters (Excel): data uploaded by users.
2. In the second stage, the source data is entered into a staging database on a server separate from the sources, running the MySQL Server engine, which holds all the source data in its original form.
3. In the third stage, the staged data goes through the extract, transform, load (ETL) process; to avoid dirty data, namely duplicates and redundancy, data cleansing needs to be carried out (see the sketch after this list).
4. The fourth stage is the formation of the data warehouse.
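To make the staging-to-warehouse step concrete, the following is a minimal sketch, assuming SQLite in place of the MySQL staging database and the SSIS packages, with illustrative table and column names: it deduplicates the staged rows and loads them into a simple star schema of one fact table and one dimension.

```python
# Sketch of stages 3 and 4: cleanse staged data (drop duplicates) and load a
# simple star schema (dim_customer + fact_shipment). SQLite stands in for the
# MySQL staging database and the warehouse; all names are illustrative.
import sqlite3

staging = sqlite3.connect(":memory:")
staging.executescript(
    """
    CREATE TABLE stg_shipments (
        order_id TEXT, order_date TEXT, customer_id TEXT,
        customer_name TEXT, amount REAL);
    -- The duplicate row simulates 'dirty data' arriving from the sources.
    INSERT INTO stg_shipments VALUES
        ('ORD-1', '2022-11-01', 'C001', 'Customer A', 150000.0),
        ('ORD-1', '2022-11-01', 'C001', 'Customer A', 150000.0),
        ('ORD-2', '2022-11-02', 'C002', 'Customer B',  75000.0);
    """
)

warehouse = sqlite3.connect(":memory:")
warehouse.executescript(
    """
    CREATE TABLE dim_customer (customer_id TEXT PRIMARY KEY, customer_name TEXT);
    CREATE TABLE fact_shipment (
        order_id TEXT, date_key INTEGER, customer_id TEXT, amount REAL);
    """
)

# Data cleansing: SELECT DISTINCT removes exact duplicates before loading.
rows = staging.execute(
    "SELECT DISTINCT order_id, order_date, customer_id, customer_name, amount "
    "FROM stg_shipments"
).fetchall()

for order_id, order_date, customer_id, customer_name, amount in rows:
    # Dimension rows are inserted once per customer; the primary key dedups.
    warehouse.execute(
        "INSERT OR IGNORE INTO dim_customer VALUES (?, ?)",
        (customer_id, customer_name),
    )
    warehouse.execute(
        "INSERT INTO fact_shipment VALUES (?, ?, ?, ?)",
        (order_id, int(order_date.replace("-", "")), customer_id, amount),
    )
warehouse.commit()

print(warehouse.execute("SELECT COUNT(*) FROM fact_shipment").fetchone())  # (2,)
```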
b. Development of ETL Tools
The main focus of this phase is to develop procedures for validating the extracted data and moving it into the Data Warehouse. The ETL tooling here uses the SSIS (SQL Server Integration Services) application. SSIS is a tool used to perform the Extract, Transform, and Load (ETL) process and is classified as a Business Intelligence feature. In relation to Business Intelligence, SSIS is used to pull data from the ERP, relational databases, or files, with the results then saved into the Data Warehouse, while ETL itself is the process of collecting data from various sources (Extract), cleaning it (Transform), and saving it into another system (Load). The development of the system as a whole is the actual implementation of the analysis and design carried out; in this phase of the project, the researchers designed the data warehouse (fact tables and dimensions) and the ETL. In addition to SSIS, this stage also uses the SSAS (SQL Server Analysis Services) application, one of the Business Intelligence tools. Analysis Services is a technology for OLAP (Online Analytical Processing) and Data Mining. OLAP administration procedures are carried out in SQL Server Management Studio, including viewing data and creating multidimensional expressions.
The OLAP operations used are drill down and pivot (rotate). Drill down presents the data in more detail, while pivot rotates the axes of the data as an alternative way of presenting it.
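As a rough analogue of these two OLAP operations outside of SSAS, the following sketch uses pandas on an in-memory sample (the columns and figures are invented for illustration): grouping at a finer grain approximates drill down, and pivot_table rotates a dimension onto the columns axis.

```python
# Illustration of drill down and pivot on a toy fact table using pandas;
# the data and column names are invented for demonstration only.
import pandas as pd

fact = pd.DataFrame({
    "year":    [2022, 2022, 2022, 2022],
    "month":   ["Oct", "Oct", "Nov", "Nov"],
    "region":  ["Jakarta", "Surabaya", "Jakarta", "Surabaya"],
    "revenue": [100.0, 80.0, 120.0, 90.0],
})

# Drill down: move from a yearly summary to a year-month level of detail.
by_year = fact.groupby("year")["revenue"].sum()
by_year_month = fact.groupby(["year", "month"])["revenue"].sum()

# Pivot (rotate): put the region dimension on the columns axis.
pivoted = fact.pivot_table(index="month", columns="region",
                           values="revenue", aggfunc="sum")

print(by_year, by_year_month, pivoted, sep="\n\n")
```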
c. Development of Metadata
Metadata describes the content, quality, condition, and other characteristics of a piece of data, written in a standard format. In this case, the reporting metadata format standard used by PT XYZ refers to the standard metadata provided by management. Metadata has several functions, including identifying data, grouping similar data, distinguishing data according to certain criteria, and providing important information related to the data.
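A minimal, hypothetical example of such a metadata record for one reporting data set is sketched below; the fields shown are assumptions for illustration, not PT XYZ's actual metadata standard.

```python
# Hypothetical metadata record for one reporting data set; field names are
# illustrative and do not reflect PT XYZ's actual standard.
report_metadata = {
    "dataset":      "dm1_transactional",       # identifies the data
    "subject_area": "Transactional Data",      # groups similar data
    "source":       "ps_shipment_ar",
    "refresh":      "daily 00:30",
    "owner":        "Finance Reporting Team",  # who to contact about the data
    "quality":      {"deduplicated": True, "validated": True},
}
```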
d. Development of Applications
The method used to verify the data in the Data Warehouse is to prepare reports in Analysis Services through SQL Server Management Studio, or to use a BI application, and compare them with the reporting data managed by users, which is later reported to management.
At this stage, a scheduler is set up to pull the data warehouse; the schedule generates the data in the application, sourced from the data warehouse, every day at 00:30.
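A minimal sketch of such a nightly refresh job is shown below, using a plain Python loop rather than the scheduler actually used at PT XYZ (which the paper does not specify); the refresh function is a placeholder.

```python
# Sketch of a nightly job that refreshes application data from the data
# warehouse at 00:30; the actual scheduler used at PT XYZ is not specified
# in the paper, so this loop and the refresh function are illustrative.
import datetime
import time

def refresh_reports_from_warehouse() -> None:
    """Placeholder: pull the latest data warehouse contents into the app."""
    print("refreshing reports at", datetime.datetime.now())

def seconds_until(hour: int, minute: int) -> float:
    """Seconds from now until the next occurrence of hour:minute."""
    now = datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += datetime.timedelta(days=1)
    return (target - now).total_seconds()

while True:
    time.sleep(seconds_until(0, 30))   # wait until 00:30
    refresh_reports_from_warehouse()
```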
The process flow describes the steps or stages in submitting a report from beginning to end.
Figure 6. Process Flow Data Reporting (Dina Ikramina Setiani 2020)
There are five stages in the data reporting process flow, namely:
1. Datamart
In the data mart stage, the data displayed is the data source generated by the scheduler for the selected date period.
2. Manual Adjustment
The manual adjustment stage displays and processes data sources that have been adjusted for the date period. Manual adjustment provides Create, Edit, Delete, and Import features.
3. Validation
The validation stage displays and processes data according to the report and period to be validated. Validation provides the features Create, Edit, Delete, Import (import data from Excel), Validate (validate the data), Rollback (return the data to the initial stage), and Retrieve (replace data in the selected column with data from the previous stage).
4. Approval
The approval stage displays data according to the report and period for the approval process. Approval provides the features Approve (approve data so it is processed to the next stage), Reject (reject data so it is not processed to the next stage), and Rollback (return the data to the initial stage).
5. Submission
The submission stage displays data for the report and period that has been approved for processing and is ready to be presented to management. Submission provides the features Submit (process the data into CSV form), Download (display the finished files that have been created), and Rollback (return the data to the initial stage).
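A minimal sketch of this five-stage flow as a simple state machine is shown below; the stage names follow the list above, while the transition rules (move forward on approval, rollback to the initial stage) are a simplified reading of the text.

```python
# Simplified state machine for the report workflow described above; the stage
# names come from the paper, the transition logic is an illustrative reading.
STAGES = ["Datamart", "Manual Adjustment", "Validation", "Approval", "Submission"]

def advance(stage: str) -> str:
    """Move a report to the next stage (e.g. after Validate or Approve)."""
    i = STAGES.index(stage)
    return STAGES[min(i + 1, len(STAGES) - 1)]

def rollback(_stage: str) -> str:
    """Rollback returns the report to the initial stage from anywhere."""
    return STAGES[0]

# Example: a report approved at the Approval stage moves on to Submission;
# a rejected or rolled-back report goes back to the Datamart stage.
assert advance("Approval") == "Submission"
assert rollback("Validation") == "Datamart"
```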
To access the features above, each user has a role (access rights). This means that each user level has different access rights, depending on the authority given to that user. The following are the roles with access rights to the available features:
Table 4. Role Process Flow

Feature | Role (Access Rights)
Datamart | Maker
Manual Adjustment | Maker
Validation | Maker
Approval | Approver
Submission | Approver
e. Release and Testing
The main purpose of system testing is to ensure that the resulting system meets the previously specified requirements. Testing the system is very important, so process documentation is created to record all activities carried out during the development of the Business Intelligence application, including the integration of the reporting systems. This documentation can take the form of a User Manual, a Business Requirement Definition (BRD), a Functional Specification Document (FSD), System Integration Testing (SIT), and User Acceptance Testing (UAT). This documentation can later be used as a guide for the users and teams involved in this project as well as for those who are not. After the documentation is made, a validation process involving the related users of this reporting project is carried out to check and confirm that the information system has been implemented correctly, in accordance with user needs and its intended use.
The following is an example of a report that has been tailored to the needs of the user:
Figure 7. Reports that have been tailored to user needs
Result Discussion
From the discussion above, the efficiency and effectiveness of reporting can be compared with the previous process, in which internal parties at PT XYZ downloaded data as several files, combined them into one file, and then processed that file to obtain the necessary information, a process that took 1-2 weeks. With the Harmony application, the reporting process at PT XYZ becomes shorter and simpler. The data displayed in Harmony is available in real time and no longer needs to be processed manually, so the input process to NETSUITE is more effective and efficient, and management can monitor it regularly because Business Intelligence has been implemented.
With the Harmony application, the reporting process conveyed by PT XYZ to the management level uses a portal that is integrated with the metadata. Harmony is a reporting feature that is delivered directly to the management level, so information can be received faster, making the reporting process quicker and more accurate.
CONCLUSIONS
It can be concluded that the development of Business Intelligence at PT XYZ is as follows:
1. The transaction data warehouse designed using PDI (Pentaho Data Integration) is very helpful in collecting transaction data from the original transaction source, which already has a database but serves only as OLTP (Online Transactional Processing) data, so that it can be turned into OLAP (Online Analytical Processing) data and analyzed using OLAP.
2. In addition, the transaction data reporting designed in this study using PRD (Pentaho Report Designer) is very helpful in producing transaction reports that can be adjusted to the needs, so that the recurring problems of dependence on report acquisition are expected not to occur again.
3. The transaction dashboard, designed using Pentaho CDE (Community Dashboard Editor), is very helpful for leaders in analyzing data to study the trends of the transactions carried out at PT XYZ, can be used to support decision making, and can also serve as a measure of company performance.
REFERENCES
Arifin, Zainal, and Aris Sugiharto. 2013. "Design a University Business Intelligence System as a Support for Academic Decision Making." Journal of Business Information Systems 01(01): 30–40.
Chang, Elizabeth, Tharam Dillon, and Farookh K. Hussain. 2006. Trust and Reputation for Service-Oriented Environments: Technologies for Building Business Intelligence and Consumer Confidence. (April): 1–349.
Connolly, Thomas, and Carolyn Begg. 2014. Database Systems: A Practical Approach to Design, Implementation and Management. 6th Global Edition. Pearson. www.pearsonglobaleditions.com/connolly.
Dina Ikramina Setiani. 2020. "Development of Business Intelligence for the Integration of Reporting Systems in PT. Bank Mega Tbk."
Dawn, Muhammad Ibn, and Arief Rahman. 2017. "Implementation of Indonesia National Single Window (INSW) With Business Intelligence System (Bus) Approach (Empirical Study at the Directorate General of Customs and Excise)." 21(1).
Haryono, Kholid, et al. 2018. "Optimization of Enterprise Data to Improve Information Quality Using Business Intelligence.": 39–45.
Henderi, Sri Rahayu, and Bangun Mukti Prasetyo. 2012. "Dashboard Information System Based on Key Performance Indicators." National Seminar on Informatics 2012 (semnasIF 2012): 82–87.
Inmon, W. H. 2002. Building the Data Warehouse. New York, NY: John Wiley & Sons.
Karyono, Giat, Ema Utami, and Emha Lutfi Taufiq. 2011. "Development of Data Warehouse and On-Line Analytical Processing (OLAP) for Information Discovery and Data Analysis." Journal of Telematics 4(2): 13–28.
Kimball, Ralph, and Margy Ross. 2013. The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling. 3rd ed. John Wiley & Sons, Inc.
———. 2016. The Kimball Group Reader: Relentlessly Practical Tools for Data Warehousing and Business Intelligence. 2nd ed. John Wiley & Sons, Inc.
Lufty Abdillah. 2020. Application of Business Intelligence to Support Financial Statements Based on Cost Accounting: Case Study of PT XYZ. USA.
Miranda, Eka. 2008. "Business Intelligence Development for Enterprise Business Development." CommIT (Communication and Information Technology) Journal 2(2): 111.
Moss, Larissa T., and Shaku Atre. 2003. Business Intelligence Roadmap: The Complete Project Lifecycle for Decision-Support Applications: 576.
Mulyana, JRP. 2014. Pentaho: Open Source Solution for Building Data Warehouses. Yogyakarta: Andi Offset.
Nur, Zaky, and Imam Mukhlash. 2014. "Implementation of Business Intelligence in XYZ Bank Report Management." Journal of Science and Arts POMITS 3 (Business Intelligence): 16–21.
PT XYZ. "IM_XYZ_QMS_002." https://anteraja.id/.
Ramadan, Hasnur, and Agus Soepriadi. 2011. "Application of the Inmon and Kimball Combination Model to the Development of Enterprise Data Warehouse and Business Intelligence (EDW/BI): Case Study of Enterprise Data Warehouse / Business Intelligence (EDW/BI) Development in National Multi Finance Companies." Journal of ICT Lanterns 2011: 17–18.
Ramesh et al. 2018. Business Intelligence, Analytics, and Data Science: A Managerial Perspective.
Santika, Reva Ragam. 2012. "Development of a Data Warehouse Model to Support Strategic Decision Making in the Academic Field of Student Affairs: A Case Study of the Bina Sarana Academy." XIV(2): 159–71.
Siswono, Siswono. 2013. "The Role of Business Intelligence in Business Solutions." ComTech: Computer, Mathematics and Engineering Applications 4(2): 812.
Suharmanto. 2019. "The Influence of Information Systems, Data Warehouses and Business Intelligence on Organizational Performance (Case Study at BAPPEDA Pemprov DKI Jakarta)." Journal of Latera ICT.
Thomsen, Erik. 2003. "BI's Promised Land Performance." Intelligent Enterprise 6: 21–25.
Vercellis, Carlo. 2009. Business Intelligence: Data Mining and Optimization for Decision Making: 1–417.
Widianty. 2015. "Data Warehouse Design with Kimball Method: Case Study of Fahrenheit Manufacturing Systems." ComTech: Computer, Mathematics and Engineering Applications 6(4): 604.
Williams, Steve. 2016. Business Intelligence Strategy and Big Data Analytics: A General Management Perspective.
© 2022 by the authors. Submitted for possible open access publication under the terms and conditions of the Creative Commons Attribution-ShareAlike (CC BY-SA) license (https://creativecommons.org/licenses/by-sa/4.0/).