US5371883A - Method of testing programs in a distributed environment - Google Patents
Method of testing programs in a distributed environment
- Publication number
- US5371883A (application US08/037,219)
- Authority
- US
- United States
- Prior art keywords
- test
- program
- central repository
- machine
- input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3668—Testing of software
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Abstract
An improved method of testing in a distributed environment, comprising a Control Program residing in a Control Machine. The Control Machine also contains the central repository of information to control test execution in the Test Machines. The Control Program forwards instructions to a particular Test Program, residing in a Test Machine. The instructions are executed on that machine, and results are reported back to the Control Program. The Control Program verifies whether the results are correct. Depending on the results of the verification, the Control Program sends the test machine further instructions (to continue the test, stop the test, etc.). Logging the results of each test operation, keeping track of the tests performed, and coordinating the test cases are all performed on the Control Machine, by the Control Program.
Description
The present invention relates to a method of testing computer programs. More particularly, the invention relates to a method of testing programs in a distributed environment, using a single control machine to control testing and verification of results. Further, this invention relates to an integrated system of tracking and executing tests.
Adequately testing a program involves exercising the function of that program under conditions which reflect the environment in which the program was designed to operate. For programs designed to operate on a single computer, testing requires use of the actual configuration of machines on which the program is to be run (or a simulation of such a configuration) and test data reflecting the range of possible inputs to the program. For programs larger in scale, consisting of multiple subsystems, each subsystem itself composed of multiple modules, adequately testing the whole involves testing each of the modules, integrating the modules into their respective subsystems, testing the subsystems, and finally testing the entire system.
Testing in a distributed environment presents additional problems. A distributed environment is one composed of multiple computer systems operating independently and connected by a common transmission line over which they communicate with each other. Testing a program in the same hardware configuration and operating conditions becomes extremely difficult. (For a discussion of the problems see Mori, et al., Method and Apparatus for Testing a Distributed Computer System, U.S. Pat. No. 4,803,683.) In addition, in the prior art, testing in a distributed environment commonly involved multiple test teams, each working at one of the distributed systems. When changes were made to application programs, testing the changes involved distributing the changed program and associated test cases and coordinating the results of each test team. Thus, considerable effort was required simply to maintain test case libraries and track test case execution status.
Further complicating the problems related to adequately reflecting the operational environment, proper testing in a distributed environment requires precise timing and sequencing of test events in order to produce consistent, checkable results. This has been difficult, if not impossible, to achieve using humans and standalone tools. Another challenge, tracking testcase status across multiple sets of distributed systems (perhaps including several different versions of the software system), has proven difficult and time-consuming for testers.
It is therefore an object of the invention to provide an improved testing method for a distributed environment by providing a centralized test environment with a single control process which exclusively governs test sequencing and timing.
It is a further object of the invention to allow easier synchronization of the execution of testing across multiple machines.
It is a further object of the invention to provide a centralized database of test case information, to be used in planning, tracking, and as input to the control process.
It is a further object to provide a central point at which all verification of test results is performed, reducing the complexity of test programs and allowing for a more realistic measurement of the performance of the system being tested.
The invention is comprised of a Control Program residing in a Control Machine. The Control Machine is connected to other Test Machines through a communications link, thus forming a distributed network. One or more Test Programs may run in the Test Machines or, alternatively, in the Control Machine. A central repository of information to create, maintain, and define the test information is also part of the invention. From this central repository comes data indicating which testcases are to be run; this data is passed to the Control Program. The central repository may be associated with the Control Machine or one of the Test Machines. Stored in the central repository are Test Cases. These are specific sequences of inputs or items of data which are designed to exercise a Test Program.
The Control Program forwards instructions from a Test Case to one or more Test Programs to be executed. Thus, the Control Program maintains control of the sequence of execution. The results of the execution by the Test Program are reported back to the Control Program. The Control Program verifies whether the results are correct. Depending on the results of the verification and the startup parameters, the Control Program sends further instructions to the Test Programs. Logging of testcase results, tracking of the testcases performed, and coordinating the test cases are all performed by the Control Program. Result data is passed to the central repository for storage and later querying and reporting by users.
The foregoing and other advantages of the invention will be more fully understood with reference to the description of the preferred embodiment and with reference to the accompanying drawings described in the following section.
FIG. 1 is a diagram of a distributed environment, comprised of a Control Machine and several Test Machines, interconnected through a network.
FIG. 2 is a flow diagram of the invention.
FIG. 3 is a directed tree diagram of a logon procedure.
FIG. 1 is a diagram of an illustrative configuration of a distributed processing system which represents the preferred embodiment of this invention. Shown in the figure is a network 100, to which are connected the Control Machine 1 and distributed processors or Test Machines 10, 20, 30 and 40. Control Program 2 is resident on Control Machine 1. Test Machine 30 acts as a bridge machine, being also connected to network 200 and providing a means by which Test Machines on network 100 can communicate with machines on network 200. Test Machines 50 and 60 are connected to network 200. Note that any method of bridging/routing may be used, if supported by the underlying protocol used to establish communications between the various machines on the network.
FIG. 2 is a flowchart of one embodiment of the invention. In FIG. 2, step 200 shows the creation and storage of Test Case 1 in the central repository. Thus, in the preferred embodiment, Control Machine 1 contains the central repository of all test cases and their expected results. In step 210, the user interacts with the central repository to select which testcases are to be run by the Control Program. In step 220, the Control Program begins communicating with Test Program 10. In step 230, the Control Program sends instructions to Test Program 10, defining the execution of one or more steps of Test Case 1. In step 240, Test Program 10 reports the results of the execution of the previously defined steps back to the Control Program. In step 250, the Control Program verifies the reported results, given the definition of proper results from the Test Case definition. As shown in step 260, one of two things may happen. The test result may be the same as the expected result; in this case, the results are "correct" and the verification is deemed "successful". When this occurs, Control Program 1 continues with the exercise of the Test Case, sending the next instruction to Test Program 10. If, on the other hand, the test results differ from what was expected (i.e., the results were "incorrect" and the verification is deemed "unsuccessful"), then Control Program 1 will perform an error recovery procedure. Depending on the results of the error recovery procedure and the start parameters, Control Program 1 will either instruct Test Program 10 to proceed with the test (step 270) or end the test (step 280). At the end of the test (that is, the point at which either the entire test case has been executed or a stop-test decision has been made and executed), Control Program 1 uses the logged data to communicate the results of the test to the central repository. The user may then query the central repository or look directly at the logs of the Control Program to determine the test result.
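To make the flow concrete, the following minimal Python sketch mirrors the FIG. 2 loop. It is an illustration only, not the patent's implementation: test_case, test_program, and repository are assumed duck-typed objects, and every name here is hypothetical.

```python
# A minimal sketch of the FIG. 2 control flow; all names are hypothetical.

def run_test_case(test_case, test_program, repository):
    """Drive one Test Case against one Test Program, mirroring FIG. 2."""
    log = []
    for step in test_case.steps:                      # step 230: send instructions
        result = test_program.execute(step.input)     # step 240: result reported back
        ok = (result == step.expected)                # steps 250/260: verification
        log.append((step.input, result, ok))
        if not ok:                                    # "unsuccessful" verification
            action = test_case.recover(step, result)  # error recovery procedure
            if action == "stop":                      # step 280: end the test
                break                                 # otherwise step 270: proceed
    repository.store(test_case.name, log)             # logged results to repository
    return log
```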
The following is a more detailed discussion of some of the steps enumerated above.
The purpose of a Test Case is to exercise a Test Program in much the same way as it would be exercised in normal operation. That is, a Test Case is comprised of specified inputs to the Program (for example, a screen with an amount of money, an interest rate and a length of time which a user would normally input to a screen) and expected outputs (for example, a second screen with the interest calculated from the inputs).
Accordingly, the process of creating test cases consists of several steps. The first step is to identify all externally recognizable functions performed by the system. The next step is to generate functional variations. This involves expanding on those functions to include variations of the input values, conditions, and even output values. The last step is to select combinations of these variations to form test cases.
The nature of the specified inputs to a test case depends on the type of interface used by the software application to be tested. Generally speaking, there are two types of application interfaces: user interfaces (typically either a full-screen panel-based interface or a command interface) and programming interfaces (CALL-level interfaces).
The initial step in creating a Test Case is to define to the central repository the panels, commands and programming interface calls used by the program to be tested. Each of these definitions (of a panel, command, or call) is referred to as a sys/def. The definitions include all fields or parameters that are used for input and/or output.
Each field of a panel, command or call is further described, with the components of the description depending on the requirements of the test environment. Typically, each field has an associated name and length (number of characters). In any environment, the description of each field consists of identifying it as to data field type (input, output, or both), field type (specific to environment), selection field type (single or multiple selections allowed), etc. Commands are defined as input only. For programming interfaces, the fields are defined as input, output or both.
For each defined field, all values that will be used in any test case must be defined. Each value definition has a name and a value. In addition, such definitions can be grouped into classes.
Once the definitions of each field or parameter are completed, each definition is stored as a unique entry in the central repository.
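As an illustration only, the sys/def, field, and value definitions described above could be modeled as records like those in the Python sketch below. The patent does not prescribe a storage format, so every name and type here is an assumption.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ValueDefinition:
    name: str                           # e.g. "valid_userid" (hypothetical)
    value: str                          # e.g. "JSMITH" (hypothetical)

@dataclass
class FieldDefinition:
    name: str                           # field name
    length: int                         # number of characters
    data_field_type: str                # "input", "output", or "both"
    field_type: str = ""                # specific to the test environment
    selection: str = "single"           # single or multiple selections allowed
    values: List[ValueDefinition] = field(default_factory=list)

@dataclass
class SysDef:
    name: str                           # panel, command, or call name
    kind: str                           # "panel", "command", or "call"
    fields: List[FieldDefinition] = field(default_factory=list)

# Commands are defined as input only, as stated above:
help_cmd = SysDef("HELP", "command",
                  [FieldDefinition("command_line", 80, "input")])
```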
Following the completion of the sys/defs, variations, each of which describes the movement from a first sys/def to a second, are defined. Variations reflect the sequence of actions which are to be tested. For example, consider a typical logon procedure. The panels required would be the logon panel (sys/def 1), the help panel (sys/def 2), the main menu (sys/def 3), and an error panel (sys/def 4). Referring to the directed tree diagram of FIG. 3, panel variations for a logon procedure could be defined as follows (a code sketch of this structure appears after the list):
start at the logon panel 300 (sys/def 1)
enter PF1 310, bringing the user to the help panel 320 (sys/def 2)
enter PF12 330, returning to logon panel 300 (sys/def 1)
enter an invalid userid and/or password 340, bringing the user to an error panel 350 (sys/def 4)
enter a valid userid and password 370, bringing the user to the main menu 380 (sys/def 3).
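The following Python sketch encodes these variations as edges of a directed tree and enumerates each path from the top of the tree to a lowest point as a separate test case, anticipating the generation step described below. It is a simplification under stated assumptions: the help panel's PF12 return edge is cut rather than unrolled into a longer path, and all names are illustrative.

```python
# Each variation is an edge: (input entered, next sys/def reached). This
# sketch cuts revisits (the PF12 return to the logon panel) to keep paths
# finite, which is a simplifying assumption.
variations = {
    "logon":     [("PF1", "help"),
                  ("invalid userid/password", "error"),
                  ("valid userid/password", "main_menu")],
    "help":      [("PF12", "logon")],
    "error":     [],
    "main_menu": [],
}

def enumerate_test_cases(node, path=(), seen=frozenset()):
    """Yield each path from the top of the tree to a lowest point."""
    edges = [(inp, nxt) for inp, nxt in variations[node] if nxt not in seen]
    if not edges:                       # a lowest point: one complete test case
        yield path
        return
    for inp, nxt in edges:
        yield from enumerate_test_cases(nxt, path + ((node, inp, nxt),),
                                        seen | {node})

for case in enumerate_test_cases("logon"):
    steps = " -> ".join(f"{at}[{inp}]" for at, inp, _ in case)
    print(steps + " -> " + case[-1][2])
```

Run as written, this prints three paths (to the help panel, the error panel, and the main menu), each of which would be treated as a separate test case.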
Once the variations are defined, the user specifies those values (previously defined) which are applicable to each particular variation.
Test cases are built on the definitions discussed in the preceding paragraphs. The user generates a test case by selecting variations and combinations of variations. For example, a user could select one or more variations, variation classes, sys/defs, sys/def classes or nodes. A selection of a sys/def or node results in the controlling machine constructing paths that connect all selected objects. The defined directed trees are used to extend the constructed paths to the top level of the directed tree and the bottom level, thus providing the entire control flow. A test case is constructed by starting at the top point of the directed tree and flowing downward. Each path to the lowest point of the tree is considered a separate test case. Each sys/def of each variation on the given path can be equated to a "step" the program must execute.
In addition to specifying the variations which are to be included in a test case, the user must specify the values to be used. These values are selected from the corresponding variation value definitions. If data relation definitions exist, the user can use those definitions to specify more than one value at a time.
Each variation selection the user makes is governed by test case rules, as defined by the user. That is, a test case rule defines exactly which combinations of variations are allowed. Thus, the test case rules may include:
include the following variation class (or individual variation)
exclude the following variation from the class previously selected
variation A must follow variation B
variation A cannot follow variation B
can start (command/call only: this provides the system with a "top" to the hierarchy where the testcase may be begun)
can end (command/call only)
variation A can follow variation B (default)
These definitions can aid the user in controlling the generation of test cases to exactly fit the current testing requirement.
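As a hypothetical illustration of how such rules might be applied, the sketch below checks candidate sequences of variations against adjacency rules of the kinds listed above. The rule encoding is an assumption, not the patent's format.

```python
# Hypothetical rule encoding: a pair reads "variation X (must / cannot) follow
# variation Y". "Can follow" is the default when no rule forbids a pair.
must_follow   = {("A", "B")}            # variation A must follow variation B
cannot_follow = {("A", "C")}            # variation A cannot follow variation C

def allowed(sequence):
    """Check each adjacent pair of variations against the defined rules."""
    for prev, cur in zip(sequence, sequence[1:]):
        if (cur, prev) in cannot_follow:                 # forbidden ordering
            return False
        required = {b for a, b in must_follow if a == cur}
        if required and prev not in required:            # mandatory predecessor
            return False
    return True

candidates = [["B", "A"], ["C", "A"], ["B", "C", "A"]]
print([seq for seq in candidates if allowed(seq)])       # -> [['B', 'A']]
```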
Once defined, Test Cases are stored in the central repository. Referring to FIG. 1, the Control Program 1 at the Control Machine will communicate with the Test Program 10, executing the test as previously described. Each sys/def in the Test Case definition will be executed, with results being returned to the Control Program 1. Control Program 1 compares the returned result against the expected result, i.e. verifies the result.
Verification takes place after the first step of Test Case 1 is executed. For each variation, multiple verification actions can be defined and the execution order can be specified. Examples of some of the verifications that can occur are as follows:
Panel verification is normally done for a panel test. In this type of verification, the entire screen or panel is compared with a predefined panel. Another alternative is for certain portions of the screen to be "blocked out" and the comparison performed only on the remaining portions of the screen.
For command and panel tests, the returned information is checked for certain strings of text. In panel tests, the information may be searched for at a specific location on the returned panel. Alternatively, the entire screen may be searched for the string of text.
For panel tests, individual fields may be verified. For text panels, field length and cursor location are defined. For graphic panels, each field value to be checked must be captured from the panel. (This is typically accomplished by providing the controlling machine with a panel with the field in question populated by the needed value.)
For programming interface calls, each one of the parameters returned from the call may be checked for specific values.
Any two files, existing anywhere on the token ring, can be compared and verified.
A program can be run in Control Program 1 which performs some user-defined procedure. Depending on the results of the procedure, the program returns a return code to the Control Program 1 which indicates the success or failure.
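The following Python sketch illustrates three of the verification styles described above: full-panel comparison with blocked-out regions, text-string search at a location or anywhere on the panel, and return-code checking for a user-defined procedure. Panels are modeled as lists of text rows purely for illustration; the patent does not specify a panel representation.

```python
def verify_panel(actual, expected, blocked=()):
    """Compare entire panels, optionally blocking out (row, start, end) regions."""
    def mask(rows):
        rows = [list(r) for r in rows]
        for r, c0, c1 in blocked:
            rows[r][c0:c1] = "*" * (c1 - c0)   # blocked-out region is ignored
        return ["".join(r) for r in rows]
    return mask(actual) == mask(expected)

def verify_text(actual, text, location=None):
    """Search for a string at a specific (row, col), or anywhere on the panel."""
    if location is not None:
        row, col = location
        return actual[row][col:col + len(text)] == text
    return any(text in row for row in actual)

def verify_return_code(rc):
    """A user-defined procedure signals success or failure via its return code."""
    return rc == 0
```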
If the verification is successful, Control Program 1 will continue sending instructions to Test Program 10 to continue the testing. On the other hand, if verification fails, error recovery procedures are executed.
Error recovery could include any of:
logging the error
ending test execution
continuing test execution
executing defined error recovery procedures
As part of its function, the Control Machine keeps track of each test case that is executed, maintaining details of its execution. These details include, for example, the date and time of execution, a summary of execution results, performance data saved during execution, and elapsed time of the test case execution. This data is automatically passed to the central repository.
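A minimal sketch of such an execution record, with hypothetical field names chosen to match the details listed above, might look like:

```python
from dataclasses import dataclass
import datetime

@dataclass
class ExecutionRecord:
    test_case: str                      # which test case was executed
    executed_at: datetime.datetime      # date and time of execution
    summary: str                        # summary of execution results
    performance_data: dict              # performance data saved during execution
    elapsed_seconds: float              # elapsed time of the test case execution
```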
Using the central repository of information, the user can produce many reports: the status of tests, the history of test execution, details of errors occurring during execution, the environment under which a test has been run, and many others.
Depending on the requirements of the user, additional hardware could be added to the above described system. There is no limit as to the hardware configuration supported by the invention, as long as the various Control and Test Machines can communicate over the selected communications protocol. If access to a host-based local reporting system is required, then the central repository must be connected to that host system using required hardware and software emulators.
The embodiment described above is a single station test, that is, a single test being executed through one Test Program. A modification of this embodiment comprises a simultaneous test using one or more Test Programs. In this embodiment, a test case consists of a number of steps. Each step can be assigned a particular station number, and these station numbers can be assigned to any particular Test Program running in any machine in the test configuration. That is, the user can describe the Test Case so that steps are executed by multiple systems, in varying order. This provides an opportunity to test the interaction of various application interfaces with each other, and to vary the machines involved in the testing without altering the definitions of the Test Cases.
In order to define a multi-station Test Case, the user generally starts with defined individual test steps. The user then defines dependency information between test steps. There are two types of such information: serial execution between test steps and parallel execution of test steps.
In the case of serial execution between test steps, defining a dependency means that a given test step must run before the dependent test case step can be run. If an individual test step involves some error checking that would adversely affect other test steps that might be running, that test step must run by itself. The requirement can be either that the test step must run alone on a given machine (no other test station on the specified test machine can be executing a test step at the same time) or that the test step must run alone on the Control program (which means that no other test step can be running).
In the simplest version of a multi-station test, once the appropriate Test Cases are stored in the central repository, Control Program 1 will send the Test Case 1 step 1 definition over network 100 to Test Program 10 and the Test Case 1 step 2 definition over network 100 to Test Program 50. Depending on the definition of the dependency of the steps, the Control Program will instruct the Test Programs to execute step 1 and step 2. Test Program 10 executes step 1 and returns the result to the Control Program. Test Program 50 executes step 2 and returns the result to the Control Program. Control Program 1 then performs verification of step 1 and step 2. Error recovery procedures will be defined to determine actions for all the permutations of correct and incorrect results.
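A hypothetical sketch of this dispatch follows, using Python threads to stand in for the networked Test Programs: step 1 runs at Test Program 10 and step 2 at Test Program 50, with an optional serial dependency between steps. The names and the threading choice are illustrative assumptions, not the patent's mechanism.

```python
import threading

def dispatch(steps, serial_deps):
    """Run (name, station, step_fn) entries, honoring 'run X before Y' deps."""
    results = {}
    done = {name: threading.Event() for name, _, _ in steps}

    def run(name, station, step_fn):
        for dep in serial_deps.get(name, ()):    # serial execution dependency:
            done[dep].wait()                     # prerequisite step must finish
        results[name] = step_fn()                # Test Program executes the step
        done[name].set()

    threads = [threading.Thread(target=run, args=s) for s in steps]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results                               # Control Program verifies these

results = dispatch(
    steps=[("step1", "Test Program 10", lambda: "result 1"),
           ("step2", "Test Program 50", lambda: "result 2")],
    serial_deps={},        # e.g. {"step2": ["step1"]} would force serial order
)
print(results)
```

With an empty dependency table the two steps run in parallel; adding an entry serializes them, matching the two dependency types described above.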
Clearly the number of Test Cases and Test Programs engaged in any test is limited only by the number of machines attached to the Control Machine by some network connection. In FIG. 1, Test Machines 10, 20, 30, 40, 50, and 60 could all be executing Test Cases. Control Program 1 keeps track of the status of each Test Program as the Test Programs simultaneously execute the successive steps of their associated Test Cases. As each of the Test Programs reports results to Control Program 1, the results are verified and logged. Separate output logs are maintained for each executed Test Case. Reports can be printed out via the central repository. The Control Program sends the status of executed Test Cases back to the central repository.
An additional modification of this embodiment comprises a Test Case being executed on the Control Machine, itself.
The basic procedure described in both the single-station and multi-station testing environments is one in which a test case is created, stored, and sent to other systems by automated means. Another method of testing which can be used is a user-interactive, ad hoc testing procedure in which the user can interact with a program in the controlling machine, selecting a call to execute and data to pass to the system under test. These test cases are then saved, edited, and tracked like any other test case.
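A minimal sketch of such an ad hoc session, with hypothetical names and a plain console prompt standing in for the controlling machine's interface, might look like:

```python
def ad_hoc_session(test_program, repository):
    """Let the user pick calls and data interactively; save the recorded steps."""
    recorded = []
    while True:
        call = input("call to execute (blank to finish): ")
        if not call:
            break
        data = input("data to pass to the system under test: ")
        result = test_program.execute((call, data))   # run against system under test
        print("result:", result)
        recorded.append({"input": (call, data), "expected": result})
    repository.save_test_case(recorded)   # tracked like any other test case
    return recorded
```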
The above-described invention provides a more convenient technique for testing programs in a distributed environment. As discussed above, by using the central repository of test cases and their expected results, the invention avoids the problems of keeping track of and coordinating differing test cases. It also provides a convenient means of executing multi-station testing, where the interaction of multiple test machines is at issue. Finally, by keeping all test results on a single system, it simplifies the creation of reports comparing the results of the various systems and tests.
It should also be noted that since the error recovery is performed at the Control Program, traffic over the network is minimized. That is, additional messages need not be sent from the Test Program to the Control Program to determine error recovery actions. In addition, expert systems or artificial intelligence can be installed in the Control Program to deal with the error, without requiring this program to be installed at each of the Test Machines.
While the invention has been particularly shown and described with reference to the preferred embodiment thereof, it will be understood by those skilled in the art that various changes in detail may be made therein without departing from the spirit, scope, and teaching of the invention. Accordingly, the system and method herein disclosed is to be considered merely as illustrative, and the invention is to be limited only as specified in the claims.
Claims (13)
1. In a distributed processing system comprising a Control Machine and one or more Test Machines, said Control Machine and said Test Machines connected to each other through a network, a method for testing software comprising the steps of:
a. storing one or more Test Cases in a central repository, each of said Test Cases comprised of a sequence of one or more inputs, said sequence having a first input and a second input, and each of said inputs having an associated expected output;
b. storing one or more error recovery instructions in said central repository, associating each of said instructions with one or more inputs;
c. in said central repository, selecting a Test Case to be executed, and sending said Test Case to a Control Program in a Control Machine;
d. sending said first input from said Control Program over a network to a first Test Program in a Test Machine;
e. executing said first input in said first Test Program, said execution resulting in a first test result;
f. sending said first test result from said first Test Program to said Control Program over said network;
g. verifying said first test result in said Control Program by comparing said first test result with said associated expected result of said first input; if said first test result is the same as said associated expected result, said verification being deemed successful; if said first test result differs from said associated expected result, said verification being deemed unsuccessful;
h. if said verification is successful, sending said second input from said Control Program to said first Test Program; and
i. if said verification is unsuccessful, selecting said error recovery instruction associated with said first input and sending said error recovery instruction associated with said first input from said Control Program to said first Test Program.
2. A method as in claim 1 further comprising the steps of said Control Program logging said verification in a log stored in said Control Machine, and said Control Program sending said first test result to said central repository.
3. A method as in claim 2 further comprising the step of producing a report from said log stored in said central repository.
4. A method as in claim 1, further comprising the steps of:
in said central repository, selecting a second Test Case to be executed, and sending said second Test Case to said Control Program; and
executing steps d through i with said second Test Case inputs and said first Test Program.
5. A method as in claim 1, further comprising the steps of:
sending said second Test Case inputs to a second Test Program on a second Test Machine and executing steps d through i with said second Test Case inputs and said second Test Program.
6. A method as in claim 5 further comprising the step of said Control Program logging each of said verifications in one of a plurality of logs stored in said Control Machine, and sending the results of execution of said first Test Case and said second Test Case to said central repository.
7. A method as in claim 6 further comprising the step of producing a report from said logs.
8. In a distributed processing system comprising a Control Machine and a plurality of Test Machines, said Control Machine and said Test Machines connected to each other through a network, an improvement comprising:
a. means for storing one or more Test Cases in said central repository, each of said Test Cases comprised of a sequence of one or more inputs, said sequence having a first input and a second input, and each of said inputs having an associated expected result;
b. means for storing one or more error recovery instructions on said central repository, each of said instructions being associated with one or more of said inputs;
c. means in said Control Machine for selecting a first Test Case;
d. means in said Control Machine for sending said first input of said first Test Case over said network to a first Test Machine;
e. means in said first Test Machine for executing said first input, said execution resulting in a first test result;
f. means in said first Test Machine for sending said first test result to said Control Machine over said network;
g. means in said Control Machine for verifying said first test result by comparing said first test result with said associated expected result of said first input, where said verification is deemed successful if said first test result is the same as said associated expected result or said verification is deemed unsuccessful if said first test result differs from said associated expected result;
h. means in said Control Machine to send said second input to said first Test Program if said verification is deemed successful; and
i. means in said Control Machine for executing said error recovery instruction associated with said first input if said verification is deemed unsuccessful.
9. A system as in claim 8 further comprising a means in said Control Machine to log said verification in a log stored in said central repository, and means in said Control Machine to store said first test result in said central repository.
10. A system as in claim 9 further comprising a means of producing a report from said central repository.
11. A system as in claim 8, in which said selection means also selects a second Test Case and the means listed in steps d through i provide the same function to said second Test Case.
12. A system as in claim 11 further comprising means in said Control Machine to log each of said verifications of said first test results of said first and second Test Cases in one of a plurality of logs stored in said central repository and means of storing said test results of said first and second Test Cases in said central repository.
13. A system as in claim 12 further comprising means of producing a report consisting of data from said central repository.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/037,219 US5371883A (en) | 1993-03-26 | 1993-03-26 | Method of testing programs in a distributed environment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/037,219 US5371883A (en) | 1993-03-26 | 1993-03-26 | Method of testing programs in a distributed environment |
Publications (1)
Publication Number | Publication Date |
---|---|
US5371883A (en) | 1994-12-06 |
Family
ID=21893122
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/037,219 Expired - Fee Related US5371883A (en) | 1993-03-26 | 1993-03-26 | Method of testing programs in a distributed environment |
Country Status (1)
Country | Link |
---|---|
US (1) | US5371883A (en) |
US20080098358A1 (en) * | 2006-09-29 | 2008-04-24 | Sap Ag | Method and system for providing a common structure for trace data |
US20080109681A1 (en) * | 2003-11-26 | 2008-05-08 | International Business Machines Corporation | Apparatus for Adaptive Problem Determination in Distributed Service-Based Applications |
US20080127108A1 (en) * | 2006-09-29 | 2008-05-29 | Sap Ag | Common performance trace mechanism |
US20080127110A1 (en) * | 2006-09-29 | 2008-05-29 | Sap Ag | Method and system for generating a common trace data format |
US20080155350A1 (en) * | 2006-09-29 | 2008-06-26 | Ventsislav Ivanov | Enabling tracing operations in clusters of servers |
US20080155348A1 (en) * | 2006-09-29 | 2008-06-26 | Ventsislav Ivanov | Tracing operations in multiple computer systems |
US20080244233A1 (en) * | 2007-03-26 | 2008-10-02 | Microsoft Corporation | Machine cluster topology representation for automated testing |
US20080256340A1 (en) * | 2007-04-13 | 2008-10-16 | Microsoft Corporation | Distributed File Fuzzing |
US20080259998A1 (en) * | 2007-04-17 | 2008-10-23 | Cypress Semiconductor Corp. | Temperature sensor with digital bandgap |
US20080297388A1 (en) * | 2007-04-17 | 2008-12-04 | Cypress Semiconductor Corporation | Programmable sigma-delta analog-to-digital converter |
US20090066427A1 (en) * | 2005-02-04 | 2009-03-12 | Aaron Brennan | Poly-phase frequency synthesis oscillator |
US20090222647A1 (en) * | 2008-03-03 | 2009-09-03 | International Business Machines Corporation | Method and Apparatus for Reducing Test Case Generation Time in Processor Testing |
US20090307530A1 (en) * | 2008-06-10 | 2009-12-10 | Microsoft Corporation | Distributed testing system and techniques |
US20100192135A1 (en) * | 2003-02-14 | 2010-07-29 | Advantest Corporation | Method and Structure to Develop a Test Program for Semiconductor Integrated Circuits |
US20100205414A1 (en) * | 2009-02-11 | 2010-08-12 | Honeywell International Inc. | High integrity processor monitor |
US7825688B1 (en) | 2000-10-26 | 2010-11-02 | Cypress Semiconductor Corporation | Programmable microcontroller architecture(mixed analog/digital) |
US7844437B1 (en) | 2001-11-19 | 2010-11-30 | Cypress Semiconductor Corporation | System and method for performing next placements and pruning of disallowed placements for programming an integrated circuit |
US7870504B1 (en) | 2003-10-01 | 2011-01-11 | TestPlant Inc. | Method for monitoring a graphical user interface on a second computer display from a first computer |
US7893724B2 (en) | 2004-03-25 | 2011-02-22 | Cypress Semiconductor Corporation | Method and circuit for rapid alignment of signals |
US7908590B1 (en) * | 2006-03-02 | 2011-03-15 | Parasoft Corporation | System and method for automatically creating test cases through a remote client |
US7940706B2 (en) | 2001-10-01 | 2011-05-10 | International Business Machines Corporation | Controlling the state of duplexing of coupling facility structures |
US20110231822A1 (en) * | 2010-03-19 | 2011-09-22 | Jason Allen Sabin | Techniques for validating services for deployment in an intelligent workload management system |
US8026739B2 (en) | 2007-04-17 | 2011-09-27 | Cypress Semiconductor Corporation | System level interconnect with programmable switching |
CN102214139A (en) * | 2011-06-01 | 2011-10-12 | 北京航空航天大学 | Automatic test performance control and debugging method facing distributed system |
US8049569B1 (en) | 2007-09-05 | 2011-11-01 | Cypress Semiconductor Corporation | Circuit and method for improving the accuracy of a crystal-less oscillator having dual-frequency modes |
US8069428B1 (en) | 2001-10-24 | 2011-11-29 | Cypress Semiconductor Corporation | Techniques for generating microcontroller configuration information |
US8067948B2 (en) | 2006-03-27 | 2011-11-29 | Cypress Semiconductor Corporation | Input/output multiplexer bus |
US8078970B1 (en) | 2001-11-09 | 2011-12-13 | Cypress Semiconductor Corporation | Graphical user interface with user-selectable list-box |
US8078894B1 (en) | 2007-04-25 | 2011-12-13 | Cypress Semiconductor Corporation | Power management architecture, method and configuration system |
US8085067B1 (en) | 2005-12-21 | 2011-12-27 | Cypress Semiconductor Corporation | Differential-to-single ended signal converter circuit and method |
US8103497B1 (en) | 2002-03-28 | 2012-01-24 | Cypress Semiconductor Corporation | External interface for event architecture |
US8103496B1 (en) | 2000-10-26 | 2012-01-24 | Cypress Semicondutor Corporation | Breakpoint control in an in-circuit emulation system |
US8120408B1 (en) | 2005-05-05 | 2012-02-21 | Cypress Semiconductor Corporation | Voltage controlled oscillator delay cell and method |
US8130025B2 (en) | 2007-04-17 | 2012-03-06 | Cypress Semiconductor Corporation | Numerical band gap |
US8149048B1 (en) | 2000-10-26 | 2012-04-03 | Cypress Semiconductor Corporation | Apparatus and method for programmable power management in a programmable analog circuit block |
US8160864B1 (en) | 2000-10-26 | 2012-04-17 | Cypress Semiconductor Corporation | In-circuit emulator and pod synchronized boot |
US8176296B2 (en) | 2000-10-26 | 2012-05-08 | Cypress Semiconductor Corporation | Programmable microcontroller architecture |
US8219977B1 (en) * | 2007-01-26 | 2012-07-10 | Xilinx, Inc. | Using a software repository to increase the speed of software testing |
US20130117611A1 (en) * | 2011-11-09 | 2013-05-09 | Tata Consultancy Services Limited | Automated test execution plan derivation system and method |
US8499286B2 (en) * | 2010-07-27 | 2013-07-30 | Salesforce.Com, Inc. | Module testing adjustment and configuration |
US8499270B1 (en) | 2007-04-25 | 2013-07-30 | Cypress Semiconductor Corporation | Configuration of programmable IC design elements |
US8510716B1 (en) * | 2006-11-14 | 2013-08-13 | Parasoft Corporation | System and method for simultaneously validating a client/server application from the client side and from the server side |
US8533677B1 (en) | 2001-11-19 | 2013-09-10 | Cypress Semiconductor Corporation | Graphical user interface for dynamically reconfiguring a programmable device |
US20130238934A1 (en) * | 2012-03-08 | 2013-09-12 | Nec Corporation | Test method for distributed processing system and distributed processing system |
US8688491B1 (en) * | 2005-09-29 | 2014-04-01 | The Mathworks, Inc. | Testing and error reporting for on-demand software based marketing and sales |
US20140380278A1 (en) * | 2013-06-20 | 2014-12-25 | Nir Dayan | Automatic framework for parallel testing on multiple testing environments |
CN104331363A (en) * | 2014-10-17 | 2015-02-04 | 上海斐讯数据通信技术有限公司 | Automatic testing method for Android device |
US9189369B1 (en) * | 2013-03-11 | 2015-11-17 | Ca, Inc. | Systems, methods and computer program products for an automated test framework |
CN105117311A (en) * | 2015-08-07 | 2015-12-02 | 北京思特奇信息技术股份有限公司 | System deployment environment inspection method and system |
CN105279087A (en) * | 2015-10-21 | 2016-01-27 | 北京软件产品质量检测检验中心 | Test method and test system applied to test software |
US9448964B2 (en) | 2009-05-04 | 2016-09-20 | Cypress Semiconductor Corporation | Autonomous control in a programmable system |
US9471478B1 (en) | 2015-08-20 | 2016-10-18 | International Business Machines Corporation | Test machine management |
US9720805B1 (en) | 2007-04-25 | 2017-08-01 | Cypress Semiconductor Corporation | System and method for controlling a target device |
US9912547B1 (en) | 2015-10-23 | 2018-03-06 | Sprint Communications Company L.P. | Computer platform to collect, marshal, and normalize communication network data for use by a network operation center (NOC) management system |
US9928055B1 (en) * | 2015-10-23 | 2018-03-27 | Sprint Communications Company L.P. | Validating development software by comparing results from processing historic data sets |
US9959197B2 (en) * | 2015-08-31 | 2018-05-01 | Vmware, Inc. | Automated bug detection with virtual machine forking |
US10015089B1 (en) | 2016-04-26 | 2018-07-03 | Sprint Communications Company L.P. | Enhanced node B (eNB) backhaul network topology mapping |
CN108519943A (en) * | 2018-03-06 | 2018-09-11 | 平安科技(深圳)有限公司 | Test control and test execution device, method and computer storage medium |
WO2018200745A1 (en) * | 2017-04-25 | 2018-11-01 | Dennis Lin | Software functional testing |
US10268574B2 (en) * | 2016-09-01 | 2019-04-23 | Salesforce.Com, Inc. | Deployment testing for infrastructure delivery automation |
US10289465B2 (en) * | 2016-08-23 | 2019-05-14 | International Business Machines Corporation | Generating tailored error messages |
US10498625B1 (en) | 2016-10-14 | 2019-12-03 | Amazon Technologies, Inc. | Distributed testing service |
US10698662B2 (en) | 2001-11-15 | 2020-06-30 | Cypress Semiconductor Corporation | System providing automatic source code generation for personalization and parameterization of user modules |
US11080179B2 (en) | 2019-04-24 | 2021-08-03 | United States Of America As Represented By The Secretary Of The Navy | Device, system, and method for automatically detecting and repairing a bug in a computer program using a genetic algorithm |
US11237802B1 (en) | 2020-07-20 | 2022-02-01 | Bank Of America Corporation | Architecture diagram analysis tool for software development |
US11307962B2 (en) | 2018-07-09 | 2022-04-19 | United States Of America As Represented By The Secretary Of The Navy | Method for semantic preserving transform mutation discovery and vetting |
US11507494B2 (en) | 2016-02-10 | 2022-11-22 | Eggplant Limited | Method of, and apparatus for, testing computer hardware and software |
US11507496B2 (en) | 2016-02-10 | 2022-11-22 | Eggplant Limited | Method of, and apparatus for, testing computer hardware and software |
US20250045270A1 (en) * | 2023-08-03 | 2025-02-06 | The Toronto-Dominion Bank | Method and system for implementing a data corruption detection test |
1993
- 1993-03-26: US application US08/037,219 filed; granted as US5371883A; status: Expired - Fee Related
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4803683A (en) * | 1985-08-30 | 1989-02-07 | Hitachi, Ltd. | Method and apparatus for testing a distributed computer system |
US4797885A (en) * | 1985-12-11 | 1989-01-10 | Hitachi, Ltd. | Distributed processing system and method |
US4710926A (en) * | 1985-12-27 | 1987-12-01 | American Telephone And Telegraph Company, At&T Bell Laboratories | Fault recovery in a distributed processing system |
US4696003A (en) * | 1986-03-10 | 1987-09-22 | International Business Machines Corporation | System for testing interactive software |
US4953096A (en) * | 1986-08-15 | 1990-08-28 | Hitachi, Ltd. | Test method and apparatus for distributed system |
US5159600A (en) * | 1990-01-02 | 1992-10-27 | At&T Bell Laboratories | Arrangement for generating an optimal set of verification test cases |
US5271000A (en) * | 1991-03-22 | 1993-12-14 | International Business Machines Corporation | Method and apparatus for testing and evaluation of distributed networks |
Non-Patent Citations (2)
Title |
---|
Bogdan Korel, "Automated Test Data Generation for Distributed Software," IEEE, 1991, pp. 680-685. |
Carl K. Chang, "Distributed Software Testing with Specification," IEEE, 1990, pp. 112-117. |
Cited By (239)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5617534A (en) * | 1994-02-16 | 1997-04-01 | Intel Corporation | Interface protocol for testing of a cache memory |
US5598333A (en) * | 1994-02-22 | 1997-01-28 | Marsico, Jr.; Michael | Apparatus and method for electronically tracking and duplicating user input to an interactive electronic device |
US5673386A (en) * | 1994-06-29 | 1997-09-30 | U S West Technologies, Inc. | Method and system for identification of software application faults |
US5758156A (en) * | 1994-07-15 | 1998-05-26 | Fujitsu Limited | Method and apparatus of testing programs |
US5841960A (en) * | 1994-10-17 | 1998-11-24 | Fujitsu Limited | Method of and apparatus for automatically generating test program |
US5724273A (en) * | 1994-10-20 | 1998-03-03 | Tandem Computers Incorporated | Method and apparatus for analyzing test results obtained by applying a binary table and a suite of test scripts to a test subsystem control facility for a distributed systems network |
US5872909A (en) * | 1995-01-24 | 1999-02-16 | Wind River Systems, Inc. | Logic analyzer for software |
US5671351A (en) * | 1995-04-13 | 1997-09-23 | Texas Instruments Incorporated | System and method for automated testing and monitoring of software applications |
US5918004A (en) * | 1995-06-02 | 1999-06-29 | Rational Software Corporation | Remote monitoring of computer programs |
US7299455B2 (en) | 1995-06-02 | 2007-11-20 | Cisco Technology, Inc. | Remote monitoring of computer programs |
US20040153997A1 (en) * | 1995-06-02 | 2004-08-05 | International Business Machines Corporation | Remote monitoring of computer programs |
US20080147853A1 (en) * | 1995-06-02 | 2008-06-19 | Anderson Mark D | Remote monitoring of computer programs |
WO1996038733A1 (en) * | 1995-06-02 | 1996-12-05 | Pure Software, Inc. | Remote monitoring of computer programs |
US6263457B1 (en) * | 1995-06-02 | 2001-07-17 | Rational Software Corporation | Remote monitoring of computer programs |
US5909544A (en) * | 1995-08-23 | 1999-06-01 | Novell Inc. | Automated test harness |
US5867710A (en) * | 1995-09-05 | 1999-02-02 | Motorola, Inc. | Portable microkernel operating system verification and testing |
US5864660A (en) * | 1996-03-12 | 1999-01-26 | Electronic Data Systems Corporation | Testing the integration of a plurality of elements in a computer system using a plurality of tests codes, each corresponding to an alternate product configuration for an associated element |
US5764545A (en) * | 1996-03-29 | 1998-06-09 | Phase Metrics | Disk drive test sequence editor |
WO1997037289A1 (en) * | 1996-03-29 | 1997-10-09 | Phase Metrics | Disk drive test sequence editor |
US5854889A (en) * | 1996-06-26 | 1998-12-29 | Mci Worldcom, Inc. | Method and system for heterogeneous telecommunications network testing |
WO1997050034A1 (en) * | 1996-06-26 | 1997-12-31 | Mci Communications Corporation | Method and system for heterogeneous telecommunications network testing |
US5892949A (en) * | 1996-08-30 | 1999-04-06 | Schlumberger Technologies, Inc. | ATE test programming architecture |
US5799266A (en) * | 1996-09-19 | 1998-08-25 | Sun Microsystems, Inc. | Automatic generation of test drivers |
US5881219A (en) * | 1996-12-26 | 1999-03-09 | International Business Machines Corporation | Random reliability engine for testing distributed environments |
US5896494A (en) * | 1996-12-31 | 1999-04-20 | Compaq Computer Corporation | Diagnostic module dispatcher |
US6002868A (en) * | 1996-12-31 | 1999-12-14 | Compaq Computer Corporation | Test definition tool |
US5991897A (en) * | 1996-12-31 | 1999-11-23 | Compaq Computer Corporation | Diagnostic module dispatcher |
US5881221A (en) * | 1996-12-31 | 1999-03-09 | Compaq Computer Corporation | Driver level diagnostics |
US6263376B1 (en) * | 1997-02-24 | 2001-07-17 | Novell, Inc. | Generic run-time binding interpreter |
US5937154A (en) * | 1997-03-05 | 1999-08-10 | Hewlett-Packard Company | Manufacturing functional testing of computing devices using microprogram based functional tests applied via the devices' own emulation debug port |
US6023773A (en) * | 1997-05-29 | 2000-02-08 | Advanced Micro Devices, Inc. | Multi-client test harness |
US6532401B2 (en) * | 1997-06-04 | 2003-03-11 | Nativeminds, Inc. | Methods for automatically verifying the performance of a virtual robot |
US5974567A (en) * | 1997-06-20 | 1999-10-26 | Compaq Computer Corporation | Ghost partition |
US6167537A (en) * | 1997-09-22 | 2000-12-26 | Hewlett-Packard Company | Communications protocol for an automated testing system |
US6393490B1 (en) * | 1997-12-18 | 2002-05-21 | Ian James Stiles | Method and system for a programmatic feedback process for end-user support |
US6023586A (en) * | 1998-02-10 | 2000-02-08 | Novell, Inc. | Integrity verifying and correcting software |
US7039912B1 (en) * | 1998-05-12 | 2006-05-02 | Apple Computer, Inc. | Integrated computer testing and task management systems |
US7343587B2 (en) | 1998-05-12 | 2008-03-11 | Apple Inc. | System for creating, managing and executing computer testing and task management applications |
US6182245B1 (en) * | 1998-08-31 | 2001-01-30 | Lsi Logic Corporation | Software test case client/server system and method |
US9860315B2 (en) | 1998-09-10 | 2018-01-02 | International Business Machines Corporation | Controlling the state of duplexing of coupling facility structures |
US9253046B2 (en) | 1998-09-10 | 2016-02-02 | International Business Machines Corporation | Controlling the state of duplexing of coupling facility structures |
US9565013B2 (en) | 1998-09-10 | 2017-02-07 | International Business Machines Corporation | Controlling the state of duplexing of coupling facility structures |
US6425096B1 (en) * | 1998-12-31 | 2002-07-23 | Worldcom, Inc. | Method and system for audio portion test management in a client/server testing architecture |
US6510402B1 (en) * | 1999-02-04 | 2003-01-21 | International Business Machines Corporation | Component testing with a client system in an integrated test environment network |
US6574578B1 (en) | 1999-02-04 | 2003-06-03 | International Business Machines Corporation | Server system for coordinating utilization of an integrated test environment for component testing |
US6601018B1 (en) | 1999-02-04 | 2003-07-29 | International Business Machines Corporation | Automatic test framework system and method in software component testing |
US7127506B1 (en) | 1999-05-28 | 2006-10-24 | Teradyne, Inc. | PC configuration fault analysis |
US6654914B1 (en) * | 1999-05-28 | 2003-11-25 | Teradyne, Inc. | Network fault isolation |
US6934934B1 (en) | 1999-08-30 | 2005-08-23 | Empirix Inc. | Method and system for software object testing |
US6859922B1 (en) * | 1999-08-30 | 2005-02-22 | Empirix Inc. | Method of providing software testing services |
US6281894B1 (en) | 1999-08-31 | 2001-08-28 | Everdream, Inc. | Method and apparatus for configuring a hard disk and for providing support for a computer system |
US20010015977A1 (en) * | 1999-10-08 | 2001-08-23 | Stefan Johansson | Selective reception |
US20010014085A1 (en) * | 1999-10-08 | 2001-08-16 | Microsoft Corporation | Originator authentication |
US6775824B1 (en) | 2000-01-12 | 2004-08-10 | Empirix Inc. | Method and system for software object testing |
US20010052089A1 (en) * | 2000-04-27 | 2001-12-13 | Microsoft Corporation | Automated testing |
US6643802B1 (en) * | 2000-04-27 | 2003-11-04 | Ncr Corporation | Coordinated multinode dump collection in response to a fault |
US6804796B2 (en) * | 2000-04-27 | 2004-10-12 | Microsoft Corporation | Method and test tool for verifying the functionality of a software based unit |
US7095718B1 (en) * | 2000-04-29 | 2006-08-22 | Hewlett-Packard Development Company, L.P. | Client/server scan software architecture |
US8930937B2 (en) | 2000-05-25 | 2015-01-06 | Dell Marketing L.P. | Intelligent patch checker |
US7853943B2 (en) | 2000-05-25 | 2010-12-14 | Dell Marketing Usa, L.P. | Intelligent patch checker |
US8141071B2 (en) | 2000-05-25 | 2012-03-20 | Dell Marketing Usa, L.P. | Intelligent patch checker |
US7171660B2 (en) | 2000-05-25 | 2007-01-30 | Everdream Corporation | Intelligent patch checker |
US20050193386A1 (en) * | 2000-05-25 | 2005-09-01 | Everdream Corporation | Intelligent patch checker |
US6751794B1 (en) | 2000-05-25 | 2004-06-15 | Everdream Corporation | Intelligent patch checker |
US6779134B1 (en) * | 2000-06-27 | 2004-08-17 | Ati International Srl | Software test system and method |
US7240336B1 (en) | 2000-07-25 | 2007-07-03 | Sci Systems, Inc. | Interpretive simulation of software download process |
US20020042897A1 (en) * | 2000-09-29 | 2002-04-11 | Tanisys Technology Inc. | Method and system for distributed testing of electronic devices |
US6892328B2 (en) * | 2000-09-29 | 2005-05-10 | Tanisys Technology, Inc. | Method and system for distributed testing of electronic devices |
US9843327B1 (en) | 2000-10-26 | 2017-12-12 | Cypress Semiconductor Corporation | PSOC architecture |
US8149048B1 (en) | 2000-10-26 | 2012-04-03 | Cypress Semiconductor Corporation | Apparatus and method for programmable power management in a programmable analog circuit block |
US10020810B2 (en) | 2000-10-26 | 2018-07-10 | Cypress Semiconductor Corporation | PSoC architecture |
US10248604B2 (en) | 2000-10-26 | 2019-04-02 | Cypress Semiconductor Corporation | Microcontroller programmable system on a chip |
US10261932B2 (en) | 2000-10-26 | 2019-04-16 | Cypress Semiconductor Corporation | Microcontroller programmable system on a chip |
US8160864B1 (en) | 2000-10-26 | 2012-04-17 | Cypress Semiconductor Corporation | In-circuit emulator and pod synchronized boot |
US8103496B1 (en) | 2000-10-26 | 2012-01-24 | Cypress Semiconductor Corporation | Breakpoint control in an in-circuit emulation system |
US8555032B2 (en) | 2000-10-26 | 2013-10-08 | Cypress Semiconductor Corporation | Microcontroller programmable system on a chip with programmable interconnect |
US8176296B2 (en) | 2000-10-26 | 2012-05-08 | Cypress Semiconductor Corporation | Programmable microcontroller architecture |
US8736303B2 (en) | 2000-10-26 | 2014-05-27 | Cypress Semiconductor Corporation | PSOC architecture |
US9766650B2 (en) | 2000-10-26 | 2017-09-19 | Cypress Semiconductor Corporation | Microcontroller programmable system on a chip with programmable interconnect |
US10725954B2 (en) | 2000-10-26 | 2020-07-28 | Monterey Research, Llc | Microcontroller programmable system on a chip |
US8358150B1 (en) | 2000-10-26 | 2013-01-22 | Cypress Semiconductor Corporation | Programmable microcontroller architecture(mixed analog/digital) |
US7825688B1 (en) | 2000-10-26 | 2010-11-02 | Cypress Semiconductor Corporation | Programmable microcontroller architecture(mixed analog/digital) |
US20020089968A1 (en) * | 2001-01-03 | 2002-07-11 | Hans Johansson | Method of inquiring |
US7440439B2 (en) | 2001-01-03 | 2008-10-21 | Microsoft Corporation | Method of inquiring |
US20030014510A1 (en) * | 2001-07-11 | 2003-01-16 | Sun Microsystems, Inc. | Distributed processing framework system |
US20030120776A1 (en) * | 2001-07-11 | 2003-06-26 | Sun Microsystems, Inc. | System controller for use in a distributed processing framework system and methods for implementing the same |
US7114159B2 (en) | 2001-07-11 | 2006-09-26 | Sun Microsystems, Inc. | Processing resource for use in a distributed processing framework system and methods for implementing the same |
US6961937B2 (en) | 2001-07-11 | 2005-11-01 | Sun Microsystems, Inc. | Registry service for use in a distributed processing framework system and methods for implementing the same |
US7426729B2 (en) * | 2001-07-11 | 2008-09-16 | Sun Microsystems, Inc. | Distributed processing framework system |
US6986125B2 (en) | 2001-08-01 | 2006-01-10 | International Business Machines Corporation | Method and apparatus for testing and evaluating a software component using an abstraction matrix |
US20030037314A1 (en) * | 2001-08-01 | 2003-02-20 | International Business Machines Corporation | Method and apparatus for testing and evaluating a software component using an abstraction matrix |
US7610625B2 (en) * | 2001-08-07 | 2009-10-27 | Nec Corporation | Program control system, program control method and information control program |
US20030031153A1 (en) * | 2001-08-07 | 2003-02-13 | Nec Corporation | Program control system, program control method and information control program |
US7165256B2 (en) * | 2001-09-11 | 2007-01-16 | Sun Microsystems, Inc. | Task grouping in a distributed processing framework system and methods for implementing the same |
US20030120700A1 (en) * | 2001-09-11 | 2003-06-26 | Sun Microsystems, Inc. | Task grouping in a distributed processing framework system and methods for implementing the same |
US7243137B2 (en) | 2001-09-28 | 2007-07-10 | Sun Microsystems, Inc. | Remote system controller and data center and methods for implementing the same |
US6910158B2 (en) * | 2001-10-01 | 2005-06-21 | International Business Machines Corporation | Test tool and methods for facilitating testing of duplexed computer functions |
US20030065981A1 (en) * | 2001-10-01 | 2003-04-03 | International Business Machines Corporation | Test tool and methods for testing a system-managed duplexed structure |
US10491675B2 (en) | 2001-10-01 | 2019-11-26 | International Business Machines Corporation | Controlling the state of duplexing of coupling facility structures |
US8341188B2 (en) | 2001-10-01 | 2012-12-25 | International Business Machines Corporation | Controlling the state of duplexing of coupling facility structures |
US20050273659A1 (en) * | 2001-10-01 | 2005-12-08 | International Business Machines Corporation | Test tool and methods for facilitating testing of a system managed event |
US6915455B2 (en) * | 2001-10-01 | 2005-07-05 | International Business Machines Corporation | Test tool and methods for testing a system-managed duplexed structure |
US7940706B2 (en) | 2001-10-01 | 2011-05-10 | International Business Machines Corporation | Controlling the state of duplexing of coupling facility structures |
US8069428B1 (en) | 2001-10-24 | 2011-11-29 | Cypress Semiconductor Corporation | Techniques for generating microcontroller configuration information |
US10466980B2 (en) | 2001-10-24 | 2019-11-05 | Cypress Semiconductor Corporation | Techniques for generating microcontroller configuration information |
US8793635B1 (en) | 2001-10-24 | 2014-07-29 | Cypress Semiconductor Corporation | Techniques for generating microcontroller configuration information |
US8078970B1 (en) | 2001-11-09 | 2011-12-13 | Cypress Semiconductor Corporation | Graphical user interface with user-selectable list-box |
US10698662B2 (en) | 2001-11-15 | 2020-06-30 | Cypress Semiconductor Corporation | System providing automatic source code generation for personalization and parameterization of user modules |
US8370791B2 (en) | 2001-11-19 | 2013-02-05 | Cypress Semiconductor Corporation | System and method for performing next placements and pruning of disallowed placements for programming an integrated circuit |
US8533677B1 (en) | 2001-11-19 | 2013-09-10 | Cypress Semiconductor Corporation | Graphical user interface for dynamically reconfiguring a programmable device |
US7844437B1 (en) | 2001-11-19 | 2010-11-30 | Cypress Semiconductor Corporation | System and method for performing next placements and pruning of disallowed placements for programming an integrated circuit |
US7055137B2 (en) | 2001-11-29 | 2006-05-30 | I2 Technologies Us, Inc. | Distributed automated software graphical user interface (GUI) testing |
US20030098879A1 (en) * | 2001-11-29 | 2003-05-29 | I2 Technologies Us, Inc. | Distributed automated software graphical user interface (GUI) testing |
US8103497B1 (en) | 2002-03-28 | 2012-01-24 | Cypress Semiconductor Corporation | External interface for event architecture |
US7308608B1 (en) * | 2002-05-01 | 2007-12-11 | Cypress Semiconductor Corporation | Reconfigurable testing system and method |
US8402313B1 (en) | 2002-05-01 | 2013-03-19 | Cypress Semiconductor Corporation | Reconfigurable testing system and method |
US6983400B2 (en) * | 2002-05-16 | 2006-01-03 | Sun Microsystems Inc. | Distributed test harness model |
US20030217308A1 (en) * | 2002-05-16 | 2003-11-20 | Sun Microsystems, Inc. | Distributed test harness model |
US8255198B2 (en) * | 2003-02-14 | 2012-08-28 | Advantest Corporation | Method and structure to develop a test program for semiconductor integrated circuits |
US20100192135A1 (en) * | 2003-02-14 | 2010-07-29 | Advantest Corporation | Method and Structure to Develop a Test Program for Semiconductor Integrated Circuits |
US20040205725A1 (en) * | 2003-04-14 | 2004-10-14 | Lambert John Robert | Automatic determination of invalid call sequences in software components |
US7216337B2 (en) * | 2003-04-14 | 2007-05-08 | Microsoft Corporation | Automatic determination of invalid call sequences in software components |
US7281165B2 (en) * | 2003-06-12 | 2007-10-09 | Inventec Corporation | System and method for performing product tests utilizing a single storage device |
US20040255201A1 (en) * | 2003-06-12 | 2004-12-16 | Win-Harn Liu | System and method for performing product tests utilizing a single storage device |
US7401259B2 (en) * | 2003-06-19 | 2008-07-15 | Sun Microsystems, Inc. | System and method for scenario generation in a distributed system |
US20040260982A1 (en) * | 2003-06-19 | 2004-12-23 | Sun Microsystems, Inc. | System and method for scenario generation in a distributed system |
US20050015767A1 (en) * | 2003-07-01 | 2005-01-20 | Brian Nash | Operating system configuration tool |
US20050015675A1 (en) * | 2003-07-03 | 2005-01-20 | Kolawa Adam K. | Method and system for automatic error prevention for computer software |
US7596778B2 (en) * | 2003-07-03 | 2009-09-29 | Parasoft Corporation | Method and system for automatic error prevention for computer software |
US20050066233A1 (en) * | 2003-09-19 | 2005-03-24 | International Business Machines Corporation | Method, apparatus and computer program product for implementing autonomic testing and verification of software fix programs |
US7296189B2 (en) * | 2003-09-19 | 2007-11-13 | International Business Machines Corporation | Method, apparatus and computer program product for implementing autonomic testing and verification of software fix programs |
US9477567B2 (en) | 2003-10-01 | 2016-10-25 | Testplant, Inc. | Method for monitoring a graphical user interface on a second computer display from a first computer |
US9658931B2 (en) | 2003-10-01 | 2017-05-23 | TestPlant Inc. | Method for monitoring a graphical user interface on a second computer display from a first computer |
US7870504B1 (en) | 2003-10-01 | 2011-01-11 | TestPlant Inc. | Method for monitoring a graphical user interface on a second computer display from a first computer |
US20080109681A1 (en) * | 2003-11-26 | 2008-05-08 | International Business Machines Corporation | Apparatus for Adaptive Problem Determination in Distributed Service-Based Applications |
US7893724B2 (en) | 2004-03-25 | 2011-02-22 | Cypress Semiconductor Corporation | Method and circuit for rapid alignment of signals |
US7559087B2 (en) | 2004-12-10 | 2009-07-07 | Microsoft Corporation | Token generation method and apparatus |
US20060130131A1 (en) * | 2004-12-10 | 2006-06-15 | Microsoft Corporation | Token generation method and apparatus |
US8085100B2 (en) | 2005-02-04 | 2011-12-27 | Cypress Semiconductor Corporation | Poly-phase frequency synthesis oscillator |
US20090066427A1 (en) * | 2005-02-04 | 2009-03-12 | Aaron Brennan | Poly-phase frequency synthesis oscillator |
US8120408B1 (en) | 2005-05-05 | 2012-02-21 | Cypress Semiconductor Corporation | Voltage controlled oscillator delay cell and method |
US7886193B2 (en) * | 2005-05-09 | 2011-02-08 | Microsoft Corporation | System and methods for processing software authorization and error feedback |
US20060253760A1 (en) * | 2005-05-09 | 2006-11-09 | Microsoft Corporation | System and methods for processing software authorization and error feedback |
US7558986B2 (en) * | 2005-05-26 | 2009-07-07 | United Parcel Service Of America, Inc. | Software process monitor |
US20060271918A1 (en) * | 2005-05-26 | 2006-11-30 | United Parcel Service Of America, Inc. | Software process monitor |
US8332826B2 (en) | 2005-05-26 | 2012-12-11 | United Parcel Service Of America, Inc. | Software process monitor |
US7823021B2 (en) | 2005-05-26 | 2010-10-26 | United Parcel Service Of America, Inc. | Software process monitor |
US20060271916A1 (en) * | 2005-05-26 | 2006-11-30 | United Parcel Service Of America, Inc. | Software process monitor |
US20060271205A1 (en) * | 2005-05-26 | 2006-11-30 | United Parcel Service Of America, Inc. | Software process monitor |
US20060294434A1 (en) * | 2005-06-28 | 2006-12-28 | Fujitsu Limited | Test recording method and device, and computer-readable recording medium storing test recording program |
US7890932B2 (en) * | 2005-06-28 | 2011-02-15 | Fujitsu Limited | Test recording method and device, and computer-readable recording medium storing test recording program |
US8688491B1 (en) * | 2005-09-29 | 2014-04-01 | The Mathworks, Inc. | Testing and error reporting for on-demand software based marketing and sales |
US20070168734A1 (en) * | 2005-11-17 | 2007-07-19 | Phil Vasile | Apparatus, system, and method for persistent testing with progressive environment sterilization |
US20070168733A1 (en) * | 2005-12-09 | 2007-07-19 | Devins Robert J | Method and system of coherent design verification of inter-cluster interactions |
US7849362B2 (en) * | 2005-12-09 | 2010-12-07 | International Business Machines Corporation | Method and system of coherent design verification of inter-cluster interactions |
US8085067B1 (en) | 2005-12-21 | 2011-12-27 | Cypress Semiconductor Corporation | Differential-to-single ended signal converter circuit and method |
US7908590B1 (en) * | 2006-03-02 | 2011-03-15 | Parasoft Corporation | System and method for automatically creating test cases through a remote client |
US20070214391A1 (en) * | 2006-03-10 | 2007-09-13 | International Business Machines Corporation | Method and apparatus for testing software |
US8850393B2 (en) | 2006-03-10 | 2014-09-30 | International Business Machines Corporation | Method and apparatus for testing software |
US20080229284A1 (en) * | 2006-03-10 | 2008-09-18 | International Business Machines Corporation | Method and Apparatus for Testing Software |
US8067948B2 (en) | 2006-03-27 | 2011-11-29 | Cypress Semiconductor Corporation | Input/output multiplexer bus |
US8717042B1 (en) | 2006-03-27 | 2014-05-06 | Cypress Semiconductor Corporation | Input/output multiplexer bus |
US7945416B2 (en) | 2006-04-12 | 2011-05-17 | Ati Technologies, Ulc | Software or hardware test apparatus and method |
US20070244663A1 (en) * | 2006-04-12 | 2007-10-18 | Ati Technologies Inc. | Software or hardware test apparatus and method |
US7930595B2 (en) * | 2006-06-22 | 2011-04-19 | International Business Machines Corporation | Method and apparatus for analyzing error conditions in a massively parallel computer system by identifying anomalous nodes within a communicator set |
US20080022261A1 (en) * | 2006-06-22 | 2008-01-24 | Thomas Michael Gooding | Method and Apparatus for Analyzing Error Conditions in a Massively Parallel Computer System by Identifying Anomalous Nodes Within a Communicator Set |
US20080155350A1 (en) * | 2006-09-29 | 2008-06-26 | Ventsislav Ivanov | Enabling tracing operations in clusters of servers |
US7941789B2 (en) | 2006-09-29 | 2011-05-10 | Sap Ag | Common performance trace mechanism |
US8037458B2 (en) | 2006-09-29 | 2011-10-11 | Sap Ag | Method and system for providing a common structure for trace data |
US8028200B2 (en) * | 2006-09-29 | 2011-09-27 | Sap Ag | Tracing operations in multiple computer systems |
US7979850B2 (en) | 2006-09-29 | 2011-07-12 | Sap Ag | Method and system for generating a common trace data format |
US7954011B2 (en) * | 2006-09-29 | 2011-05-31 | Sap Ag | Enabling tracing operations in clusters of servers |
US20080127110A1 (en) * | 2006-09-29 | 2008-05-29 | Sap Ag | Method and system for generating a common trace data format |
US20080155348A1 (en) * | 2006-09-29 | 2008-06-26 | Ventsislav Ivanov | Tracing operations in multiple computer systems |
US20080098358A1 (en) * | 2006-09-29 | 2008-04-24 | Sap Ag | Method and system for providing a common structure for trace data |
US20080127108A1 (en) * | 2006-09-29 | 2008-05-29 | Sap Ag | Common performance trace mechanism |
US8510716B1 (en) * | 2006-11-14 | 2013-08-13 | Parasoft Corporation | System and method for simultaneously validating a client/server application from the client side and from the server side |
US8219977B1 (en) * | 2007-01-26 | 2012-07-10 | Xilinx, Inc. | Using a software repository to increase the speed of software testing |
US20080244233A1 (en) * | 2007-03-26 | 2008-10-02 | Microsoft Corporation | Machine cluster topology representation for automated testing |
US7827273B2 (en) * | 2007-03-26 | 2010-11-02 | Microsoft Corporation | Machine cluster topology representation for automated testing |
US7743281B2 (en) * | 2007-04-13 | 2010-06-22 | Microsoft Corporation | Distributed file fuzzing |
US20080256340A1 (en) * | 2007-04-13 | 2008-10-16 | Microsoft Corporation | Distributed File Fuzzing |
US20080297388A1 (en) * | 2007-04-17 | 2008-12-04 | Cypress Semiconductor Corporation | Programmable sigma-delta analog-to-digital converter |
US20080259998A1 (en) * | 2007-04-17 | 2008-10-23 | Cypress Semiconductor Corp. | Temperature sensor with digital bandgap |
US8130025B2 (en) | 2007-04-17 | 2012-03-06 | Cypress Semiconductor Corporation | Numerical band gap |
US8026739B2 (en) | 2007-04-17 | 2011-09-27 | Cypress Semiconductor Corporation | System level interconnect with programmable switching |
US8092083B2 (en) | 2007-04-17 | 2012-01-10 | Cypress Semiconductor Corporation | Temperature sensor with digital bandgap |
US8476928B1 (en) | 2007-04-17 | 2013-07-02 | Cypress Semiconductor Corporation | System level interconnect with programmable switching |
US8040266B2 (en) | 2007-04-17 | 2011-10-18 | Cypress Semiconductor Corporation | Programmable sigma-delta analog-to-digital converter |
US8909960B1 (en) | 2007-04-25 | 2014-12-09 | Cypress Semiconductor Corporation | Power management architecture, method and configuration system |
US8078894B1 (en) | 2007-04-25 | 2011-12-13 | Cypress Semiconductor Corporation | Power management architecture, method and configuration system |
US9720805B1 (en) | 2007-04-25 | 2017-08-01 | Cypress Semiconductor Corporation | System and method for controlling a target device |
US8499270B1 (en) | 2007-04-25 | 2013-07-30 | Cypress Semiconductor Corporation | Configuration of programmable IC design elements |
US8049569B1 (en) | 2007-09-05 | 2011-11-01 | Cypress Semiconductor Corporation | Circuit and method for improving the accuracy of a crystal-less oscillator having dual-frequency modes |
US7836343B2 (en) * | 2008-03-03 | 2010-11-16 | International Business Machines Corporation | Method and apparatus for reducing test case generation time in processor testing |
US20090222647A1 (en) * | 2008-03-03 | 2009-09-03 | International Business Machines Corporation | Method and Apparatus for Reducing Test Case Generation Time in Processor Testing |
US20090307530A1 (en) * | 2008-06-10 | 2009-12-10 | Microsoft Corporation | Distributed testing system and techniques |
US7827438B2 (en) | 2008-06-10 | 2010-11-02 | Microsoft Corporation | Distributed testing system and techniques |
US20100205414A1 (en) * | 2009-02-11 | 2010-08-12 | Honeywell International Inc. | High integrity processor monitor |
US8352795B2 (en) | 2009-02-11 | 2013-01-08 | Honeywell International Inc. | High integrity processor monitor |
US9448964B2 (en) | 2009-05-04 | 2016-09-20 | Cypress Semiconductor Corporation | Autonomous control in a programmable system |
US20110231822A1 (en) * | 2010-03-19 | 2011-09-22 | Jason Allen Sabin | Techniques for validating services for deployment in an intelligent workload management system |
US9317407B2 (en) * | 2010-03-19 | 2016-04-19 | Novell, Inc. | Techniques for validating services for deployment in an intelligent workload management system |
US8499286B2 (en) * | 2010-07-27 | 2013-07-30 | Salesforce.Com, Inc. | Module testing adjustment and configuration |
CN102214139B (en) * | 2011-06-01 | 2013-11-27 | 北京航空航天大学 | An Execution Control and Scheduling Method for Distributed System-Oriented Automated Testing |
CN102214139A (en) * | 2011-06-01 | 2011-10-12 | 北京航空航天大学 | Automated test execution control and debugging method for distributed systems |
US20130117611A1 (en) * | 2011-11-09 | 2013-05-09 | Tata Consultancy Services Limited | Automated test execution plan derivation system and method |
US9378120B2 (en) * | 2011-11-09 | 2016-06-28 | Tata Consultancy Services Limited | Automated test execution plan derivation system and method |
US20130238934A1 (en) * | 2012-03-08 | 2013-09-12 | Nec Corporation | Test method for distributed processing system and distributed processing system |
US9058313B2 (en) * | 2012-03-08 | 2015-06-16 | Nec Corporation | Test method for distributed processing system and distributed processing system |
US9189369B1 (en) * | 2013-03-11 | 2015-11-17 | Ca, Inc. | Systems, methods and computer program products for an automated test framework |
US20140380278A1 (en) * | 2013-06-20 | 2014-12-25 | Nir Dayan | Automatic framework for parallel testing on multiple testing environments |
US9021438B2 (en) * | 2013-06-20 | 2015-04-28 | Sap Portals Israel Ltd | Automatic framework for parallel testing on multiple testing environments |
CN104331363A (en) * | 2014-10-17 | 2015-02-04 | 上海斐讯数据通信技术有限公司 | Automatic testing method for Android device |
CN105117311A (en) * | 2015-08-07 | 2015-12-02 | 北京思特奇信息技术股份有限公司 | System deployment environment inspection method and system |
US9658946B2 (en) | 2015-08-20 | 2017-05-23 | International Business Machines Corporation | Test machine management |
US9471478B1 (en) | 2015-08-20 | 2016-10-18 | International Business Machines Corporation | Test machine management |
US9886371B2 (en) | 2015-08-20 | 2018-02-06 | International Business Machines Corporation | Test machine management |
US9563526B1 (en) | 2015-08-20 | 2017-02-07 | International Business Machines Corporation | Test machine management |
US9501389B1 (en) | 2015-08-20 | 2016-11-22 | International Business Machines Corporation | Test machine management |
US9959197B2 (en) * | 2015-08-31 | 2018-05-01 | Vmware, Inc. | Automated bug detection with virtual machine forking |
CN105279087B (en) * | 2015-10-21 | 2018-02-06 | 北京软件产品质量检测检验中心 | Test method and test system applied to test software |
CN105279087A (en) * | 2015-10-21 | 2016-01-27 | 北京软件产品质量检测检验中心 | Test method and test system applied to test software |
US9928055B1 (en) * | 2015-10-23 | 2018-03-27 | Sprint Communications Company L.P. | Validating development software by comparing results from processing historic data sets |
US9912547B1 (en) | 2015-10-23 | 2018-03-06 | Sprint Communications Company L.P. | Computer platform to collect, marshal, and normalize communication network data for use by a network operation center (NOC) management system |
US11507496B2 (en) | 2016-02-10 | 2022-11-22 | Eggplant Limited | Method of, and apparatus for, testing computer hardware and software |
US11507494B2 (en) | 2016-02-10 | 2022-11-22 | Eggplant Limited | Method of, and apparatus for, testing computer hardware and software |
US10015089B1 (en) | 2016-04-26 | 2018-07-03 | Sprint Communications Company L.P. | Enhanced node B (eNB) backhaul network topology mapping |
US10289465B2 (en) * | 2016-08-23 | 2019-05-14 | International Business Machines Corporation | Generating tailored error messages |
US10268574B2 (en) * | 2016-09-01 | 2019-04-23 | Salesforce.Com, Inc. | Deployment testing for infrastructure delivery automation |
US10498625B1 (en) | 2016-10-14 | 2019-12-03 | Amazon Technologies, Inc. | Distributed testing service |
US11159411B2 (en) | 2016-10-14 | 2021-10-26 | Amazon Technologies, Inc. | Distributed testing service |
US10248543B2 (en) | 2017-04-25 | 2019-04-02 | Dennis Lin | Software functional testing |
WO2018200745A1 (en) * | 2017-04-25 | 2018-11-01 | Dennis Lin | Software functional testing |
CN108519943A (en) * | 2018-03-06 | 2018-09-11 | 平安科技(深圳)有限公司 | Test control and test execution device, method and computer storage medium |
US11307962B2 (en) | 2018-07-09 | 2022-04-19 | United States Of America As Represented By The Secretary Of The Navy | Method for semantic preserving transform mutation discovery and vetting |
US11080179B2 (en) | 2019-04-24 | 2021-08-03 | United States Of America As Represented By The Secretary Of The Navy | Device, system, and method for automatically detecting and repairing a bug in a computer program using a genetic algorithm |
US11237802B1 (en) | 2020-07-20 | 2022-02-01 | Bank Of America Corporation | Architecture diagram analysis tool for software development |
US20250045270A1 (en) * | 2023-08-03 | 2025-02-06 | The Toronto-Dominion Bank | Method and system for implementing a data corruption detection test |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5371883A (en) | Method of testing programs in a distributed environment | |
US6385765B1 (en) | Specification and verification for concurrent systems with graphical and textual editors | |
US5754760A (en) | Automatic software testing tool | |
EP1504347B1 (en) | Automated software testing system and method | |
US6141630A (en) | System and method for automated design verification | |
US5870539A (en) | Method for generalized windows application install testing for use with an automated test tool | |
US7117484B2 (en) | Recursive use of model based test generation for middleware validation | |
US4595981A (en) | Method of testing interfaces between computer program modules | |
Tsai et al. | Scenario-based functional regression testing | |
US9348731B2 (en) | Tracing the execution path of a computer program | |
US6698012B1 (en) | Method and system for testing behavior of procedures | |
US6182245B1 (en) | Software test case client/server system and method | |
US20130326486A1 (en) | Keyword based software testing system and method | |
US6487704B1 (en) | System and method for identifying finite state machines and verifying circuit designs | |
WO2000002123A1 (en) | Method for defining durable data for regression testing | |
US7831864B1 (en) | Persistent context-based behavior injection or testing of a computing system | |
Lei et al. | A combinatorial testing strategy for concurrent programs | |
Kranzlmüller et al. | NOPE: A nondeterministic program evaluator | |
KR100267762B1 (en) | Software test system | |
CN109669868A (en) | The method and system of software test | |
US7720827B2 (en) | Network meta-data libraries and related methods | |
Abdalla et al. | Test case prioritization for mobile apps | |
Gargantini et al. | ViBBA: A Toolbox for Automatic Model Driven Animation. | |
Du Bousquet et al. | Specification-based testing of synchronous software | |
Yen et al. | Test data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: INTERNATIONAL BUSINESS MACHINES CORP., NEW YORK. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST. Assignors: GROSS, KIMBERLY L.; SULLIVAN, KIRK D. Reel/frame: 006482/0861. Signing dates: 19930310 to 19930311 |
| REMI | Maintenance fee reminder mailed | |
| FPAY | Fee payment | Year of fee payment: 4 |
| SULP | Surcharge for late payment | |
| REMI | Maintenance fee reminder mailed | |
| LAPS | Lapse for failure to pay maintenance fees | |
| STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
| FP | Expired due to failure to pay maintenance fee | Effective date: 20021206 |