
CN104965687B - Big data processing method and processing device based on instruction set generation - Google Patents

Big data processing method and processing device based on instruction set generation

Info

Publication number
CN104965687B
CN104965687B
Authority
CN
China
Prior art keywords
instruction
function
big data
time function
intermediate representation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510303600.XA
Other languages
Chinese (zh)
Other versions
CN104965687A (en)
Inventor
季桃桃
余佳阳
霍卫平
金正皓
郭志弘
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BEIJING BONC TECHNOLOGY Co Ltd
Original Assignee
BEIJING BONC TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BEIJING BONC TECHNOLOGY Co Ltd filed Critical BEIJING BONC TECHNOLOGY Co Ltd
Priority to CN201510303600.XA priority Critical patent/CN104965687B/en
Publication of CN104965687A publication Critical patent/CN104965687A/en
Application granted granted Critical
Publication of CN104965687B publication Critical patent/CN104965687B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Devices For Executing Special Programs (AREA)

Abstract

The present invention relates to a big data processing method and device based on instruction set generation, which aim to solve the problem of how to improve big data processing speed. The method includes generating a just-in-time compiled function and processing big data using the just-in-time compiled function. Generating the just-in-time compiled function includes: establishing an instruction database; determining a function execution flow according to a user requirement; selecting intermediate representation instructions from the instruction database according to the function execution flow to generate an intermediate representation instruction set; and compiling the intermediate representation instruction set into the just-in-time compiled function. The just-in-time compiled function is designed according to the user requirement, and only some of the intermediate representation instructions in the instruction database are selected to generate the intermediate representation instruction set and, in turn, the just-in-time compiled function. Because the just-in-time compiled function of the present invention contains only instructions relevant to the processing task at hand, the number of instructions is greatly reduced relative to a general-purpose program, so the processing speed is greatly improved when big data is processed with the just-in-time compiled function.

Description

Big data processing method and processing device based on instruction set generation
Technical field
The present invention relates to a big data processing method and processing device based on instruction set generation.
Background technology
Big data processing mainly involves operations such as cleaning, storing, computing and analyzing massive amounts of data. Data comes in many types, such as Int, Long, Double, Char and VarChar, and even more complex Number types, and there are likewise many kinds of operations on data, such as mathematical operations, logical operations, comparison operations and type conversions. Therefore, when writing a big data processing program, a programmer typically writes a general-purpose program that covers all data types and all operations, in order to avoid program crashes caused by unknown data types, unknown operations or other unforeseen situations; such a general-purpose program is consequently complicated and cumbersome.
When such a general-purpose program is used to perform a specific processing task, such as a query, a filter or a calculation, data processing is slow because the general-purpose program contains many instructions that are irrelevant to the specific processing task.
Summary of the invention
The technical problem to be solved by the invention is: how to improve the speed of big data processing.
In order to solve the above technical problem, the present invention proposes a big data processing method based on instruction set generation, including:
generating a just-in-time compiled function; and
processing big data using the just-in-time compiled function;
wherein generating the just-in-time compiled function includes:
establishing an instruction database, the instruction database storing a variety of intermediate representation instructions;
determining a corresponding function execution flow according to a user requirement;
selecting, from the instruction database and according to the function execution flow, the intermediate representation instructions that match the function execution flow, and generating an intermediate representation instruction set; and
compiling the intermediate representation instruction set into the just-in-time compiled function.
Further, the intermediate representation instructions are stored in the instruction database after being encapsulated.
Further, determining the corresponding function execution flow includes:
generating an expression tree according to the user requirement;
traversing the expression tree to obtain attribute information of the expression tree; and
determining the function execution flow according to the expression tree and its attribute information.
Further, the just-in-time compiled function is generated using a low-level virtual machine just-in-time compilation method.
Further, before the big data is processed using the just-in-time compiled function, the method further includes performing automatic vectorization processing on the just-in-time compiled function, which specifically includes:
analyzing the parallelism of the user requirement; and
adding corresponding automatic vectorization processing instructions to the just-in-time compiled function according to the result of the parallelism analysis.
In order to solve the above technical problem, the invention also provides a big data processing device based on instruction set generation, including:
a function generation module, configured to generate a just-in-time compiled function; and
a processing module, configured to process big data using the just-in-time compiled function;
wherein the function generation module includes:
an instruction database establishing submodule, configured to establish an instruction database, the instruction database storing a variety of intermediate representation instructions;
a flow determination submodule, configured to determine a corresponding function execution flow according to a user requirement;
an instruction set generation submodule, configured to select, from the instruction database and according to the function execution flow, the intermediate representation instructions that match the function execution flow and to generate an intermediate representation instruction set; and
a function generation submodule, configured to compile the intermediate representation instruction set into the just-in-time compiled function.
Further, the intermediate representation instructions are stored in the instruction database after being encapsulated.
Further, the flow determination submodule includes:
an expression tree generation unit, configured to generate an expression tree according to the user requirement;
an attribute information acquiring unit, configured to traverse the expression tree and obtain attribute information of the expression tree; and
a function execution flow determining unit, configured to determine the function execution flow according to the expression tree and its attribute information.
Further, the just-in-time compiled function is generated using a low-level virtual machine just-in-time compilation method.
Further, the device also includes a vectorization processing module, configured to perform automatic vectorization processing before the big data is processed using the just-in-time compiled function; the vectorization processing module includes:
a parallelism analysis submodule, configured to analyze the parallelism of the user requirement; and
an instruction adding submodule, configured to add corresponding automatic vectorization processing instructions to the just-in-time compiled function according to the result of the parallelism analysis.
The just-in-time compiled function of the present invention is designed according to the user requirement, and only some of the intermediate representation instructions in the instruction database are selected to generate the intermediate representation instruction set and, in turn, the just-in-time compiled function. Because the just-in-time compiled function of the present invention contains only instructions relevant to the processing task at hand, the number of instructions is greatly reduced relative to a general-purpose program, so the processing speed is greatly improved when big data is processed with the just-in-time compiled function.
Brief description of the drawings
The features and advantages of the present invention can be understood more clearly with reference to the accompanying drawings, which are schematic and should not be understood as limiting the present invention in any way. In the drawings:
Fig. 1 shows a flow block diagram of a big data processing method based on instruction set generation according to the present invention.
Fig. 2 shows a block diagram of a big data processing device based on instruction set generation according to the present invention.
Embodiment
Embodiments of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, the present invention provides a big data processing method based on instruction set generation. The method includes:
generating a just-in-time compiled function; and
processing big data using the just-in-time compiled function;
wherein generating the just-in-time compiled function includes:
establishing an instruction database, the instruction database storing a variety of intermediate representation instructions;
determining a corresponding function execution flow according to a user requirement;
selecting, from the instruction database and according to the function execution flow, the intermediate representation instructions that match the function execution flow, and generating an intermediate representation instruction set; and
compiling the intermediate representation instruction set into the just-in-time compiled function.
Here, the English term for "intermediate representation" in "intermediate representation instruction" is Intermediate Representation. Intermediate representation instructions are similar to assembly language and are instructions that can be mapped directly to machine code; they are abbreviated as IR instructions.
To cover the various situations that may arise, the types of intermediate representation instructions include:
(1) Pointer-type instructions for the various data types; in IR syntax, all data types are handled as pointer types.
(2) Type conversion instructions; type conversion here mainly refers to conversion between value types.
(3) Variable declaration, load and store instructions. Variable declarations include global variable declarations and local variable declarations. Whether global or local, the data types are all pointer types, so load instructions and store instructions are used to read values and to assign values.
(4) Data computation instructions, including +, -, *, /, & and |. IR syntax distinguishes operations by type; for example, the addition of int types and the addition of double types use different instructions.
(5) Comparison instructions, including >, <, <=, >=, == and !=.
(6) Logic instructions: or, and.
(7) Standard library call instructions, including memcmp, memcpy, memset, floor, ceil, round, etc.
(8) Static library call instructions. Although IR is a kind of optimized code, its complexity makes programming and maintenance relatively difficult for programmers. Therefore, some relatively complex functional modules of the present invention are written in C++ and called from IR in the form of static libraries (see the sketch after this list). In addition, there are call instructions for third-party libraries, which are likewise called from IR in the form of static libraries. The static library call mechanism allows the present invention to provide more services, produce a greater variety of functions and meet more demands.
(9) Flow control instructions, for example the generation of basic blocks (BasicBlock) and jump instructions, instructions that implement for loops, instructions that implement if-else, and so on.
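As an illustration of item (8) only, the following is a minimal sketch, not taken from the patent itself, of how a more complex functional module might be written in C++ and exposed to generated IR through a static library. The function name bonc_case_when and its signature are hypothetical; extern "C" is used so the generated IR can reference the symbol without C++ name mangling.

```cpp
// Hypothetical helper, compiled into a static library and linked into the JIT process.
// The generated IR would contain a call instruction that references the plain
// symbol name "bonc_case_when".
extern "C" double bonc_case_when(bool condition, double value_if_true, double value_if_false) {
    // Behaves like a simple CASE WHEN ... THEN ... ELSE ... END expression.
    return condition ? value_if_true : value_if_false;
}
```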
The above-described step of selecting intermediate representation instructions from the instruction database according to the function execution flow and generating the intermediate representation instruction set is, in essence, a process of selecting, arranging and assembling the intermediate representation instructions in the instruction database.
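The following is a minimal sketch, purely illustrative and not part of the patented solution, of one way such an instruction database and the selection/assembly step could be modeled: the database maps instruction names to small encapsulated emitter callbacks, and an instruction set is produced by looking up only the emitters required by the function execution flow. All names here (InstructionDatabase, add_i32 and so on) are hypothetical.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Hypothetical model: each database entry is an encapsulated emitter that
// appends one IR-like instruction (as text, for illustration) to the set.
using Emitter = std::function<void(std::vector<std::string>&)>;
using InstructionDatabase = std::map<std::string, Emitter>;

InstructionDatabase build_instruction_database() {
    InstructionDatabase db;
    db["add_i32"]  = [](std::vector<std::string>& out) { out.push_back("%sum = add i32 %c1, %c2"); };
    db["icmp_sgt"] = [](std::vector<std::string>& out) { out.push_back("%cmp = icmp sgt i32 %sum, %c3"); };
    db["ret_i1"]   = [](std::vector<std::string>& out) { out.push_back("ret i1 %cmp"); };
    return db;
}

// Selecting and assembling: only the instructions required by the execution
// flow are taken from the database, in the order the flow dictates.
std::vector<std::string> assemble_instruction_set(const InstructionDatabase& db,
                                                  const std::vector<std::string>& flow) {
    std::vector<std::string> instruction_set;
    for (const auto& step : flow) {
        db.at(step)(instruction_set);
    }
    return instruction_set;
}
```

For a filter such as C1 + C2 > C3, the execution flow would select only add_i32, icmp_sgt and ret_i1, so the assembled set contains three instructions rather than the whole database.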
The just-in-time compiled function of the present invention is designed according to the user requirement, so only some of the intermediate representation instructions in the instruction database are selected to generate the intermediate representation instruction set and, in turn, the just-in-time compiled function. Because the just-in-time compiled function of the present invention contains only instructions relevant to the processing task at hand, the number of instructions is greatly reduced relative to a general-purpose program.
For example, consider a table containing hundreds of millions of rows that is to be filtered according to one or several of its fields. A just-in-time compiled function is produced for this user requirement, and this function is called when scanning every row of the table, so the fewer instructions the just-in-time compiled function contains, the more the speed of big data processing can be improved.
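Purely as an illustration of the row-by-row invocation described above (not code from the patent), a scan over the table might call the compiled predicate once per row; the FilterFn signature and the column layout are assumptions.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical signature of the JIT-compiled filter for a requirement such as C1 + C2 > C3.
using FilterFn = bool (*)(int c1, int c2, int c3);

// The predicate is called once per row, so every instruction it contains is
// executed hundreds of millions of times over a table of that size.
std::vector<std::size_t> scan_table(const std::vector<int>& col1,
                                    const std::vector<int>& col2,
                                    const std::vector<int>& col3,
                                    FilterFn predicate) {
    std::vector<std::size_t> matching_rows;
    for (std::size_t row = 0; row < col1.size(); ++row) {
        if (predicate(col1[row], col2[row], col3[row])) {
            matching_rows.push_back(row);
        }
    }
    return matching_rows;
}
```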
The present invention is designed for a specific big data processing task; the specific processing task may be, for example, a specific query or a specific calculation. In other words, the present invention adopts a customization strategy: a different just-in-time compiled function is customized for each different processing task, which is a flexible strategy of providing a specific solution for a specific problem.
Further, the intermediate representation instructions are stored in the instruction database after being encapsulated.
Here, the present invention further limits the intermediate representation instructions: the intermediate representation instructions in the instruction database exist as encapsulated functional modules, so the meaning of each intermediate representation instruction is easy to understand, which improves the working efficiency of programmers and also improves the maintainability of the code.
Further, determining the function execution flow includes:
generating an expression tree according to the user requirement;
traversing the expression tree to obtain attribute information of the expression tree; and
determining the function execution flow according to the expression tree and its attribute information.
Here, the invention provides a method for determining the function execution flow.
The so-called user requirement may be a query, a filter (for example, C1 > 5, which filters out the rows whose first column is greater than 5), a calculation (for example, C1 + C2, which calculates the sum of the first column and the second column), a more complex expression (for example, floor(C1) + C2 > 100, or case when), or another customized requirement.
The so-called attribute information includes information such as data types and data indexes.
The so-called function execution flow refers to which operations are performed, which step is done first and which step follows, what is done when a condition holds, what is done when the condition does not hold, and so on.
To facilitate understanding, the process of determining the function execution flow is illustrated by the following example, which should not be understood as limiting the above technical solution.
For example, a user needs to perform a filter on a big table, where the filter expression is C1 + C2 > C3. A big table refers to a data table with a large amount of data and many fields. The meaning of the above filter expression, i.e., the user requirement, is: select the rows of the big table in which the sum of the first column and the second column is greater than the third column. The method of determining the function execution flow is as follows: generate the expression tree, traverse the expression tree, and obtain the attribute information of the tree: C1, C2 and C3 are all of Int type, and their indexes are 0, 1 and 2, respectively. The resulting function execution flow is: first compute the sum of the Int column with index 0 and the Int column with index 1, and then compare whether that sum is greater than the Int column with index 2.
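The following is a minimal sketch, offered only for illustration and under the assumption that the expression tree for C1 + C2 > C3 is represented as a small node structure; the type and helper names (ExprNode, column, binary) are hypothetical, not the patent's own data structures. Traversing the tree collects the column attribute information (type and index) and yields the execution flow as an ordered list of steps.

```cpp
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// Hypothetical expression tree node: either a column reference or a binary operation.
struct ExprNode {
    std::string op;            // "+", ">" for operators; empty for a column reference
    int column_index = -1;     // only meaningful for a column reference
    std::string column_type;   // e.g. "Int"; only meaningful for a column reference
    std::unique_ptr<ExprNode> left, right;
};

std::unique_ptr<ExprNode> column(int index, const std::string& type) {
    auto node = std::make_unique<ExprNode>();
    node->column_index = index;
    node->column_type = type;
    return node;
}

std::unique_ptr<ExprNode> binary(const std::string& op,
                                 std::unique_ptr<ExprNode> lhs,
                                 std::unique_ptr<ExprNode> rhs) {
    auto node = std::make_unique<ExprNode>();
    node->op = op;
    node->left = std::move(lhs);
    node->right = std::move(rhs);
    return node;
}

// Post-order traversal: children are visited before their parent operator,
// so the visit order directly gives the function execution flow.
void collect_execution_flow(const ExprNode& node, std::vector<std::string>& flow) {
    if (node.left)  collect_execution_flow(*node.left, flow);
    if (node.right) collect_execution_flow(*node.right, flow);
    if (node.op.empty()) {
        flow.push_back("load " + node.column_type + " column, index " +
                       std::to_string(node.column_index));
    } else {
        flow.push_back("apply operator " + node.op);
    }
}

int main() {
    // Expression tree for C1 + C2 > C3; attribute info: all Int, indexes 0, 1, 2.
    auto tree = binary(">", binary("+", column(0, "Int"), column(1, "Int")), column(2, "Int"));

    std::vector<std::string> flow;
    collect_execution_flow(*tree, flow);
    for (const auto& step : flow) std::cout << step << "\n";
}
```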
Further, the intermediate representation instruction set is compiled into the just-in-time compiled function using low-level virtual machine just-in-time compilation technology.
The so-called low-level virtual machine just-in-time compilation technology is Low Level Virtual Machine Just-in-Time compilation, abbreviated LLVM-JIT. Just-in-time compilation is also referred to as real-time compilation or on-the-fly compilation.
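As a rough illustration of this LLVM-JIT step (not the patent's own code, only one plausible realization), the following sketch builds the intermediate representation for the filter C1 + C2 > C3 with LLVM's IRBuilder and hands it to the ORC LLJIT engine. It assumes a reasonably recent LLVM (roughly version 15 or later, where lookup returns an ExecutorAddr), must be linked against the LLVM libraries, and keeps error handling to a minimum; the names filter_module and filter_row are illustrative.

```cpp
// Hedged sketch: build IR for "bool filter_row(int c1, int c2, int c3) { return c1 + c2 > c3; }"
// and JIT-compile it with LLVM ORC. Adjust includes/calls for your LLVM version.
#include "llvm/ExecutionEngine/Orc/LLJIT.h"
#include "llvm/ExecutionEngine/Orc/ThreadSafeModule.h"
#include "llvm/IR/IRBuilder.h"
#include "llvm/IR/LLVMContext.h"
#include "llvm/IR/Module.h"
#include "llvm/Support/Error.h"
#include "llvm/Support/TargetSelect.h"

using namespace llvm;
using namespace llvm::orc;

int main() {
    InitializeNativeTarget();
    InitializeNativeTargetAsmPrinter();

    auto Ctx = std::make_unique<LLVMContext>();
    auto Mod = std::make_unique<Module>("filter_module", *Ctx);

    // Only the instructions required by this user requirement are emitted:
    // an integer add, a signed greater-than comparison, and a return.
    Type *I32 = Type::getInt32Ty(*Ctx);
    FunctionType *FT = FunctionType::get(Type::getInt1Ty(*Ctx), {I32, I32, I32}, false);
    Function *F = Function::Create(FT, Function::ExternalLinkage, "filter_row", Mod.get());
    IRBuilder<> B(BasicBlock::Create(*Ctx, "entry", F));
    Value *Sum = B.CreateAdd(F->getArg(0), F->getArg(1), "sum");
    Value *Cmp = B.CreateICmpSGT(Sum, F->getArg(2), "cmp");
    B.CreateRet(Cmp);

    // Hand the module to the ORC LLJIT engine and obtain a callable pointer.
    auto JIT = cantFail(LLJITBuilder().create());
    cantFail(JIT->addIRModule(ThreadSafeModule(std::move(Mod), std::move(Ctx))));
    auto Addr = cantFail(JIT->lookup("filter_row"));
    auto *FilterRow = Addr.toPtr<bool (*)(int, int, int)>();

    // The compiled predicate can now be invoked once per scanned row.
    bool keep = FilterRow(3, 4, 5);   // 3 + 4 > 5 -> true
    (void)keep;
    return 0;
}
```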
Further, before the big data is processed using the just-in-time compiled function, the method further includes automatic vectorization processing, which specifically includes:
analyzing the parallelism of the user requirement; and
adding corresponding automatic vectorization processing instructions to the just-in-time compiled function according to the result of the parallelism analysis.
The English term for the so-called automatic vectorization processing is Automatic Vectorization. The present invention may use a CPU SIMD compiler to perform the automatic vectorization processing. According to the characteristics of the just-in-time compiled program, the compiler uncovers the parallelism in the just-in-time compiled function by means such as loop unrolling, data dependence analysis and instruction reordering, and merges the instructions that exhibit parallelism into vector instructions supported by the CPU.
To facilitate understanding, the benefit and convenience of automatic vectorization processing is illustrated by the following example, which should not be understood as limiting the above technical solution.
For example, a user wants to perform a simple calculation whose formula is (C1 + C2) * (C3 - C4). A just-in-time compiled function corresponding to this user requirement is obtained through the above technical solution of the present invention. After the parallelism of the user requirement is analyzed and the corresponding automatic vectorization processing instructions are added to the just-in-time compiled function, the process of processing big data with the automatically vectorized just-in-time compiled function is:
(1) batch-process C1 + C2 and cache the result in Tmp1;
(2) batch-process C3 - C4 and cache the result in Tmp2;
(3) batch-process Tmp1 * Tmp2.
If the just-in-time compiled function did not add the automatic vectorization processing instructions, the data would be scanned row by row during processing instead of being processed in batches of multiple rows, so the automatic vectorization processing step further improves the rate of data processing.
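Purely as a sketch of the batch pattern above (not the patent's generated code), the calculation (C1 + C2) * (C3 - C4) can be written as three simple loops over contiguous column arrays with temporary buffers Tmp1 and Tmp2; the function name evaluate_batch is hypothetical, and loops of this shape are the kind a SIMD-capable compiler can typically auto-vectorize.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical batch evaluation of (C1 + C2) * (C3 - C4) over column arrays.
// Each loop body is branch-free and walks the data sequentially, which lets an
// auto-vectorizing compiler turn it into SIMD instructions.
std::vector<double> evaluate_batch(const std::vector<double>& c1,
                                   const std::vector<double>& c2,
                                   const std::vector<double>& c3,
                                   const std::vector<double>& c4) {
    const std::size_t n = c1.size();
    std::vector<double> tmp1(n), tmp2(n), result(n);

    for (std::size_t i = 0; i < n; ++i) tmp1[i] = c1[i] + c2[i];       // (1) C1 + C2 -> Tmp1
    for (std::size_t i = 0; i < n; ++i) tmp2[i] = c3[i] - c4[i];       // (2) C3 - C4 -> Tmp2
    for (std::size_t i = 0; i < n; ++i) result[i] = tmp1[i] * tmp2[i]; // (3) Tmp1 * Tmp2

    return result;
}
```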
Of course, it is also possible that only part of the user requirement exhibits parallelism, in which case automatic vectorization processing instructions are added to the just-in-time compiled function only for that part. For example, some of the calculation steps in the user requirement expression may be fairly simple while another part is more complex; the fairly simple part can be vectorized, while the processing procedure of the more complex part is written in C++, built into a static library and called from IR. When the instructions corresponding to the simple part are executed, automatic vectorization is triggered and the data is processed in batches; when the instructions corresponding to the complex part are reached, batch processing is not possible and the computation can only proceed row by row. Furthermore, when a construct such as case when is encountered, which needs to be handled row by row, batch processing through automatic vectorization is not possible. It can be seen that automatic vectorization processing has certain limitations.
When using a CPU SIMD compiler to vectorize the just-in-time compiled function, the following points should be noted (a sketch following this list illustrates several of them):
(1) Loop counts: report loop count information to the compiler, for example that the count is a multiple of 4 or 8.
(2) Data alignment: report the data alignment to the compiler to improve data throughput.
(3) Simple logic: reduce unnecessary condition-judgment code, and avoid conditional jumps, condition judgments and function calls inside loops.
(4) Eliminate data dependences: report to the compiler that no data dependence exists.
(5) Use the restrict keyword to eliminate ambiguity in data accesses.
(6) Select suitable data operation types; for example, using SIMD instructions such as NEON for 64-bit double-precision floating point brings no gain.
(7) Plan data access patterns reasonably and use sequential access as far as possible; this is also an issue to consider in data structure design.
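The following is a minimal sketch, under the assumption of a compiler that auto-vectorizes simple loops (for example GCC or Clang at -O3), of a loop written in line with several of the points above: restrict-qualified pointers to rule out aliasing, no branches or function calls in the loop body, and purely sequential access. The function name add_columns is illustrative only.

```cpp
#include <cstddef>

// Hypothetical kernel written to be friendly to auto-vectorization:
//  - __restrict (a widely supported compiler extension mirroring C's restrict)
//    tells the compiler the arrays do not alias (point (5));
//  - the loop body has no branches, condition judgments or function calls (point (3));
//  - the arrays are walked sequentially (point (7)).
void add_columns(const double* __restrict c1,
                 const double* __restrict c2,
                 double* __restrict out,
                 std::size_t n) {
    // A trip count that is a multiple of the vector width (point (1)) helps the
    // compiler avoid generating a scalar remainder loop.
    for (std::size_t i = 0; i < n; ++i) {
        out[i] = c1[i] + c2[i];
    }
}
```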
Using a CPU SIMD compiler to realize automatic vectorization frees programmers from low-level compilation work so that they can concentrate on code development, while the cumbersome and repetitive compilation work is handed over to the CPU SIMD compiler, which improves working efficiency.
Corresponding to the above method, the present invention also provides a functional module architecture corresponding to the method, i.e., a big data processing device based on instruction set generation. As shown in Fig. 2, the device includes:
a function generation module, configured to generate a just-in-time compiled function; and
a processing module, configured to process big data using the just-in-time compiled function;
wherein the function generation module includes:
an instruction database establishing submodule, configured to establish an instruction database, the instruction database storing a variety of intermediate representation instructions;
a flow determination submodule, configured to determine a corresponding function execution flow according to a user requirement;
an instruction set generation submodule, configured to select, from the instruction database and according to the function execution flow, the intermediate representation instructions that match the function execution flow and to generate an intermediate representation instruction set; and
a function generation submodule, configured to compile the intermediate representation instruction set into the just-in-time compiled function.
Further, the instruction database establishing submodule includes:
an instruction generation unit, configured to generate all the intermediate representation instructions; and
an encapsulation unit, configured to encapsulate the intermediate representation instructions.
Further, the intermediate representation instructions are stored in the instruction database after being encapsulated.
Further, the function generation submodule compiles the intermediate representation instruction set into the just-in-time compiled function using a low-level virtual machine just-in-time compilation method.
Further, the device also includes a vectorization processing module, configured to perform automatic vectorization processing on the just-in-time compiled function before the big data is processed using the just-in-time compiled function;
the vectorization processing module includes:
a vectorization feasibility analysis submodule, configured to analyze whether the user requirement can be vectorized; and
an instruction synthesis submodule, configured to add automatic vectorization instructions in the process of generating the just-in-time compiled function.
For the sake of brevity, the specific explanations, descriptions and beneficial effects of the above device, which parallel those of the above method, are not repeated here; for related content, refer to the corresponding content of the above method solution.
In summary, the present invention has the following advantages:
(1) The present invention customizes a just-in-time compiled function according to the user requirement and processes big data with this customized function; because the number of instructions in the just-in-time compiled function is greatly reduced relative to a general-purpose function, the speed of data processing is improved and time overhead is reduced.
(2) The intermediate representation instructions in the instruction database are encapsulated, which improves the maintainability of the instructions.
(3) Automatic vectorization processing is used to realize batch processing of data, which further improves the speed of data processing.
It should be understood that the technical features in the above method and device solutions can be combined in any way as long as they do not conflict, and the combined solutions still fall within the protection scope of the present invention.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, those skilled in the art can make various modifications and variations without departing from the spirit and scope of the present invention, and such modifications and variations all fall within the scope defined by the appended claims.

Claims (8)

  1. A big data processing method based on instruction set generation, characterized by comprising:
    generating a just-in-time compiled function; and
    processing big data using the just-in-time compiled function;
    wherein generating the just-in-time compiled function includes:
    establishing an instruction database, the instruction database storing a variety of intermediate representation instructions;
    determining a corresponding function execution flow according to a user requirement;
    selecting, from the instruction database and according to the function execution flow, the intermediate representation instructions that match the function execution flow, and generating an intermediate representation instruction set; and
    compiling the intermediate representation instruction set into the just-in-time compiled function;
    wherein determining the corresponding function execution flow includes:
    generating an expression tree according to the user requirement;
    traversing the expression tree to obtain attribute information of the expression tree; and
    determining the function execution flow according to the expression tree and its attribute information.
  2. The big data processing method according to claim 1, characterized in that the intermediate representation instructions are stored in the instruction database after being encapsulated.
  3. The big data processing method according to claim 1, characterized in that the just-in-time compiled function is generated using a low-level virtual machine just-in-time compilation method.
  4. The big data processing method according to claim 1, characterized in that, before the big data is processed using the just-in-time compiled function, the method further includes automatic vectorization processing, which specifically includes:
    analyzing the parallelism of the user requirement; and
    adding corresponding automatic vectorization processing instructions to the just-in-time compiled function according to the result of the parallelism analysis.
  5. A big data processing device based on instruction set generation, characterized by comprising:
    a function generation module, configured to generate a just-in-time compiled function; and
    a processing module, configured to process big data using the just-in-time compiled function;
    wherein the function generation module includes:
    an instruction database establishing submodule, configured to establish an instruction database, the instruction database storing a variety of intermediate representation instructions;
    a flow determination submodule, configured to determine a corresponding function execution flow according to a user requirement;
    an instruction set generation submodule, configured to select, from the instruction database and according to the function execution flow, the intermediate representation instructions that match the function execution flow and to generate an intermediate representation instruction set; and
    a function generation submodule, configured to compile the intermediate representation instruction set into the just-in-time compiled function;
    wherein the flow determination submodule includes:
    an expression tree generation unit, configured to generate an expression tree according to the user requirement;
    an attribute information acquiring unit, configured to traverse the expression tree and obtain attribute information of the expression tree; and
    a function execution flow determining unit, configured to determine the function execution flow according to the expression tree and its attribute information.
  6. The big data processing device according to claim 5, characterized in that the intermediate representation instructions are stored in the instruction database after being encapsulated.
  7. The big data processing device according to claim 5, characterized in that the just-in-time compiled function is generated using a low-level virtual machine just-in-time compilation method.
  8. The big data processing device according to claim 5, characterized by further comprising a vectorization processing module, configured to perform automatic vectorization processing before the big data is processed using the just-in-time compiled function;
    the vectorization processing module includes:
    a parallelism analysis submodule, configured to analyze the parallelism of the user requirement; and
    an instruction adding submodule, configured to add corresponding automatic vectorization processing instructions to the just-in-time compiled function according to the result of the parallelism analysis.
CN201510303600.XA 2015-06-04 2015-06-04 Big data processing method and processing device based on instruction set generation Active CN104965687B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510303600.XA CN104965687B (en) 2015-06-04 2015-06-04 Big data processing method and processing device based on instruction set generation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510303600.XA CN104965687B (en) 2015-06-04 2015-06-04 Big data processing method and processing device based on instruction set generation

Publications (2)

Publication Number Publication Date
CN104965687A CN104965687A (en) 2015-10-07
CN104965687B true CN104965687B (en) 2017-12-08

Family

ID=54219723

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510303600.XA Active CN104965687B (en) 2015-06-04 2015-06-04 Big data processing method and processing device based on instruction set generation

Country Status (1)

Country Link
CN (1) CN104965687B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109165220B (en) * 2018-08-09 2021-06-22 天津威努特信息技术有限公司 Data matching calculation method
CN109753306A (en) * 2018-12-28 2019-05-14 北京东方国信科技股份有限公司 A big data processing method based on a precompiled function cache engine
CN110597554A (en) * 2019-08-01 2019-12-20 浙江大学 A method for automatic generation and optimization of instruction function of instruction set simulator
CN112445483B (en) * 2019-08-27 2023-11-24 龙芯中科技术股份有限公司 Instruction generation method and device, electronic equipment and storage medium
CN111460454A (en) * 2020-03-13 2020-07-28 中国科学院计算技术研究所 A smart contract similarity retrieval method and system based on stack instruction sequence
CN111679858B (en) * 2020-05-27 2025-08-22 中国平安财产保险股份有限公司 Operation instruction processing method, device, computer equipment and storage medium
CN114924745A (en) * 2022-05-19 2022-08-19 北京百度网讯科技有限公司 Operating method, device and electronic device for deep learning compiler

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1309349A (en) * 2001-03-22 2001-08-22 北京北大天正科技发展有限公司 Method for on-line customization of software
CN1453699A (en) * 2002-04-26 2003-11-05 株式会社东芝 Generating method for developing environment in development on-chip system and media for storing the same program
JP2006091945A (en) * 2004-09-21 2006-04-06 Fuji Xerox Co Ltd Software processor
CN101876908A (en) * 2010-06-30 2010-11-03 中兴通讯股份有限公司 User customizing method and system
US8489543B2 (en) * 2005-08-12 2013-07-16 Sugarcrm Inc. Customer relationship management system and method
CN103207786A (en) * 2013-04-28 2013-07-17 中国人民解放军信息工程大学 Progressive intelligent backtracking vectorization code tuning method
CN103984541A (en) * 2014-04-14 2014-08-13 美的集团股份有限公司 Method and system for generating application procedure based on terminal source codes

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1309349A (en) * 2001-03-22 2001-08-22 北京北大天正科技发展有限公司 Method for on-line customization of software
CN1453699A (en) * 2002-04-26 2003-11-05 株式会社东芝 Generating method for developing environment in development on-chip system and media for storing the same program
JP2006091945A (en) * 2004-09-21 2006-04-06 Fuji Xerox Co Ltd Software processor
US8489543B2 (en) * 2005-08-12 2013-07-16 Sugarcrm Inc. Customer relationship management system and method
CN101876908A (en) * 2010-06-30 2010-11-03 中兴通讯股份有限公司 User customizing method and system
CN103207786A (en) * 2013-04-28 2013-07-17 中国人民解放军信息工程大学 Progressive intelligent backtracking vectorization code tuning method
CN103984541A (en) * 2014-04-14 2014-08-13 美的集团股份有限公司 Method and system for generating application procedure based on terminal source codes

Also Published As

Publication number Publication date
CN104965687A (en) 2015-10-07

Similar Documents

Publication Publication Date Title
CN104965687B (en) Big data processing method and processing device based on instruction set generation
US11036614B1 (en) Data control-oriented smart contract static analysis method and system
US6381739B1 (en) Method and apparatus for hierarchical restructuring of computer code
Lattner et al. Making context-sensitive points-to analysis with heap cloning practical for the real world
KR101360512B1 (en) Register allocation with simd architecture using write masks
US6530079B1 (en) Method for optimizing locks in computer programs
US5146594A (en) Method of producing object program based on interprocedural dataflow analysis of a source program
EP0810523A2 (en) Method for sequencing computer instruction execution in a data processing system
EP0229245A2 (en) Method for optimizing register allocation and assignments
Fluet et al. Implicitly threaded parallelism in Manticore
US8458671B1 (en) Method and system for stack back-tracing in computer programs
Leupers et al. Function inlining under code size constraints for embedded processors
CN109460237A (en) The Compilation Method and device of code
US7694288B2 (en) Static single assignment form pattern matcher
US20100250564A1 (en) Translating a comprehension into code for execution on a single instruction, multiple data (simd) execution
US6360360B1 (en) Object-oriented compiler mechanism for automatically selecting among multiple implementations of objects
Gómez-Zamalloa et al. Test case generation for object-oriented imperative languages in CLP
US20020062478A1 (en) Compiler for compiling source programs in an object-oriented programming language
JP5048949B2 (en) Software tools including asynchronous program flow modeling
Greiner et al. A provably time-efficient parallel implementation of full speculation
CN103530471B (en) A kind of CPA method based on simulator
Lindstrom Static evaluation of functional programs
CN101882190B (en) Method for formally verifying bytecode intermediate representation program module by module
Zhu et al. Locality analysis for parallel C programs
Setzer Java as a functional programming language

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Big data processing method and apparatus based on instruction set generation

Effective date of registration: 20190709

Granted publication date: 20171208

Pledgee: Zhongguancun Beijing technology financing Company limited by guarantee

Pledgor: BEIJING BONC TECHNOLOGY CO., LTD.

Registration number: 2019990000686

PC01 Cancellation of the registration of the contract for pledge of patent right
PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20211202

Granted publication date: 20171208

Pledgee: Zhongguancun Beijing technology financing Company limited by guarantee

Pledgor: BUSINESS-INTELLIGENCE OF ORIENTAL NATIONS Corp.,Ltd.

Registration number: 2019990000686