Development of SDL-Based Software for an Embedded System: Practical Experiences
Authors: Stefan Bläsius, Josef Maier, Stefan Karg, Günter Kohler
Speaker: Günter Kohler
www.tenovis.com
Contents
- Target platform and its restrictions
- Starting with SDL-88 (in-house developed tool)
- Development process and quality assurance using SDL & MSC
- Host test and target feedback
- Target testing with Telelogic's Microtester
- Development experiences using SDL
- Conclusion
Target Platform: System Integral 3
[Diagram: LAN-connected intelligent units (with µP 8018x) linked via the IOM-2 bus and the µP bus to non-intelligent units supporting S-, T- or analogue interfaces]
Target Platform: SDL Tasks and Target Restrictions
- SDL tasks: Layer 3, CTI, DECT, Mobility Control, MMI
- Restrictions: ROM < 1 MB, RAM < 1/2 MB
Starting with SDL-88: In-house Developed SDL-88 Tool
[Tool-chain diagram on a SUN workstation: Tenovis SDL GR and Tenovis SDL-88 PR alongside standard SDL PR; Tenovis list outputs; Tenovis code generator producing C]
Starting with SDL-88: Drawbacks of Our In-house SDL Tool
- Inter-process communication not supported
- Process instantiation not supported
- Insufficient tool chain
=> 1997: turn to Telelogic's SDT
Development Process and Quality Assurance Using SDL & MSC
[Diagram: requirements and system design decompose into blocks A, B, C connected by channels x and y; the blocks contain tasks 1 to 4]
Development Process and Quality Assurance Using SDL & MSC
[Diagram: requirements and system design yield the SDL model and MSC test cases; quality assurance loop: interactive host test (debugging with TUSSI), automatic host test (regression), then target test (Microtester, emulator, system debugger) on the pre-tested system; results and traces feed back into a re-design]
Specifying Test Cases by Using MSC
~HTML~
Test case   : IUT_EX01_testcase_example
Responsible : Stefan Karg / Günter Kohler
In this example the syntax and semantics of a test case designed with MSC are demonstrated.
~HTML~
MSC IUT_EX01_testcase_example L2 IUT L3 IUT_MOD01_startup Transmit_req data_ind (DATA 0x01 0x02 0x03) (SNR 0x01) EXPECT_ERROR E01_TRANSMIT_ERROR Neg_ack (CAUSE 0x93, SNR 0x01) T_NO_ANSWER EXPECT_ERROR E01_TRANSMIT_ERROR Data_ind (SNR 0x01) pos_ack (SNR 0x01) transmit_ack SIM_CMD list-process -
Regression Testing
- MSCs specify the behaviour of the system; conversion turns them into a command file for testing the IUT plus the testing setup
- Inputs and expected reactions are converted into a reference result
- The actual results and traces shall be compared (intelligently) against the reference
- Report generation (active, HTML)
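The "intelligent" comparison step above matches the actual trace against the reference instead of demanding byte-identical output. A minimal sketch in C, assuming an invented convention where reference lines starting with "*" are don't-care lines (for timestamps and the like); the real tool's matching rules are not shown on the slide:

```c
#include <assert.h>
#include <string.h>

/* Compare an actual trace against a reference trace of n lines.
 * A reference line beginning with '*' is a don't-care line and
 * matches anything (hypothetical wildcard convention).
 * Returns 1 on match, 0 on mismatch. */
static int traces_match(const char *ref[], const char *act[], int n)
{
    for (int i = 0; i < n; i++) {
        if (ref[i][0] == '*')            /* don't-care: skip content */
            continue;
        if (strcmp(ref[i], act[i]) != 0) /* exact match required     */
            return 0;
    }
    return 1;
}
```

A failed comparison would then be flagged in the generated HTML report together with the offending trace line.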
Test Report
Feedback from Target Testing
[Diagram: test cases drive the interactive host test (debugging with TUSSI) and the automatic host test (regression); after conversion they also drive the target test (Microtester, emulator, system debugger) on the target system; results and traces are fed back]
SDL Integration into the Target
Light integration of SDL systems into the RTOS
[Diagram: SDL processes A, B, C run inside one OS task on top of the C-basic / C-advanced / C-micro library; the environment functions xinenv and xoutenv connect the SDL system to the other OS tasks of the RTOS]
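The two environment functions are the seam of this light integration: one polls the RTOS and feeds incoming messages into the SDL system, the other forwards outgoing SDL signals to the owning OS task. A sketch of the idea in C; the type names, signatures, and the in-memory mailbox are hypothetical stand-ins, not the real C-micro API:

```c
#include <assert.h>

#define QUEUE_LEN 8

typedef struct {            /* hypothetical RTOS mailbox message */
    int signal_id;
    int param;
} RtosMsg;

typedef struct {            /* hypothetical SDL signal instance  */
    int sig;
    int data;
} SdlSignal;

/* --- tiny stand-in for an RTOS mailbox (ring buffer) ------------- */
static RtosMsg mailbox[QUEUE_LEN];
static int mb_head, mb_tail;

static void rtos_send(const RtosMsg *in)
{
    mailbox[mb_tail % QUEUE_LEN] = *in;
    mb_tail++;
}

static int rtos_receive(RtosMsg *out)   /* non-blocking receive */
{
    if (mb_head == mb_tail)
        return 0;
    *out = mailbox[mb_head % QUEUE_LEN];
    mb_head++;
    return 1;
}

/* --- environment functions in the style of xinenv / xoutenv ------ */
static SdlSignal sdl_queue[QUEUE_LEN];  /* stands in for the SDL input queue */
static int sdl_count;

void xinenv(void)           /* poll the RTOS, feed the SDL system */
{
    RtosMsg m;
    while (rtos_receive(&m)) {
        SdlSignal s = { m.signal_id, m.param };
        sdl_queue[sdl_count++] = s;     /* real kernel: enqueue SDL signal */
    }
}

void xoutenv(const SdlSignal *s)        /* SDL signal leaving the system */
{
    RtosMsg m = { s->sig, s->data };
    rtos_send(&m);                      /* forward to the owning OS task */
}
```

The SDL main loop inside the OS task would call xinenv() once per iteration, dispatch queued signals to processes A, B, C, and route any signal addressed to the environment through xoutenv().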
On-Target Testing with the Microtester
Distribution of the Microtester parts between the target side and the hosts (host A, host B)
Lauterbach gateway as communication link between host and target:
- reduces development effort
- no extra communication mechanism
- no additional interface hardware on the PBX boards
[Diagram: target with ICE probe and target software; ICE software and Microtester GUI on the hosts, connected via TCP/IP to the Microtester gateway]
On-Target Testing with the Microtester
GUI of the Microtester (screenshot): textual trace and user interface
On-Target Testing with the Microtester
Advantages: the Microtester offers debugging features usually available only in a development environment
- setting breakpoints at SDL level
- drawing MSC diagrams out of the target
On-Target Testing with the Microtester
Restrictions due to our target situation:
- Graphical SDL trace causes a memory overflow.
- Record-and-play mode cannot be used because of the multiprocessor system: since not all tasks are designed with SDT, recordings are always incomplete, as not all events are visible to the Microtester.
- Different code generators and kernels on the host and target: different scheduling between C-micro on the target and C-basic on the host; the original task loop of the C-micro kernel has to be modified.
- Different data structures on host (SUN Sparc) and target (Intel): different byte alignment and byte order lead to message coding (container); the internal data structures of the C-micro code and the C-basic code are completely different, requiring message coding & conditional compiling.
- Increased target software size with the Microtester.
- The different mapping may have effects on debugging.
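The "message coding (container)" point above means that messages are flattened into a byte container with a fixed wire format instead of copying structs, so that the big-endian SUN Sparc host and the little-endian Intel target can exchange them safely. A minimal sketch, assuming a big-endian wire format; the function names and the example signal layout are illustrative, not taken from the tool chain:

```c
#include <assert.h>
#include <stdint.h>

/* Write/read a 16-bit value in fixed big-endian wire order,
 * independent of the CPU's native byte order and alignment. */
static void put_u16(uint8_t *buf, uint16_t v)
{
    buf[0] = (uint8_t)(v >> 8);
    buf[1] = (uint8_t)(v & 0xFF);
}

static uint16_t get_u16(const uint8_t *buf)
{
    return (uint16_t)((buf[0] << 8) | buf[1]);
}

typedef struct {      /* example signal parameters (cf. Neg_ack in the MSC) */
    uint16_t snr;     /* sequence number        */
    uint16_t cause;   /* error cause, e.g. 0x93 */
} NegAck;

/* Encode into a 4-byte container: field by field, never memcpy of
 * the struct, so padding and endianness cannot leak onto the wire. */
static void encode_neg_ack(const NegAck *s, uint8_t out[4])
{
    put_u16(out,     s->snr);
    put_u16(out + 2, s->cause);
}

static void decode_neg_ack(const uint8_t in[4], NegAck *s)
{
    s->snr   = get_u16(in);
    s->cause = get_u16(in + 2);
}
```

Both sides compile the same encode/decode pair, which is also where the conditional compiling mentioned above would select the host or target variant of the internal data structures.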
Memory Usage: ROM
ROM with the Microtester:    SDL system 64%, C modules 19%, C-micro kernel 10%, C-micro MT 4%, debug strings 2%, SDL init data (ROM) 1%
ROM without the Microtester: SDL system 44%, free for Microtester 28%, C modules 18%, C-micro kernel 9%, SDL init data (ROM) 1%, C-micro MT 0%, debug strings 0%
Memory Usage: RAM
RAM with the Microtester:    SDL system 71%, C-micro MT 21%, C-micro kernel 4%, C modules 4%
RAM without the Microtester: SDL system 71%, free for Microtester 22%, C-micro kernel 4%, C modules 3%, C-micro MT 0%
Run Time
Influence of the Microtester on run time:
- without Microtester: 28.810 ms
- with Microtester (best case): 275.980 ms
i.e. roughly a factor of ten, even in the best case
Development Effort: SDL vs. C
Relative effort (C implementation = 100%):
- Design:         SDL 100, C 100
- Implementation: SDL  50, C 100
- Testing:        SDL  50, C 100
- Extensions:     SDL  75, C 100
Errors in SDL and C Implementations (1998)
- Errors total: 18
- In the C implementation: 16
- In the SDL implementation: 2
Development Speed with SDT
Development speed, 1st and 2nd project with SDT (in months):
                        DECT (1st)   CTI (2nd)
Design & specification  4            1.5
Implementation          3            0.75
Host test               3            0.75
Target test             3            0.5
The Right Tool for the Right Job
++ Use SDL for protocol handling
++ Use SDL for state machines
-- Don't use SDL for software parts with loops
-- Don't use SDL for small databases
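The "use SDL for state machines" advice reflects how naturally SDL transitions map to code: generators such as C-micro emit each process as a dispatch over (state, signal). A hand-written C equivalent for a toy call-control process; the states and signals are invented for illustration, not from the Tenovis system:

```c
#include <assert.h>

typedef enum { IDLE, DIALLING, CONNECTED } State;
typedef enum { OFF_HOOK, DIGITS_DONE, ON_HOOK } Signal;

/* One SDL-style transition step: given the current state and an
 * input signal, return the next state. */
static State transition(State st, Signal sig)
{
    switch (st) {
    case IDLE:
        if (sig == OFF_HOOK)    return DIALLING;
        break;
    case DIALLING:
        if (sig == DIGITS_DONE) return CONNECTED;
        if (sig == ON_HOOK)     return IDLE;
        break;
    case CONNECTED:
        if (sig == ON_HOOK)     return IDLE;
        break;
    }
    return st;   /* unhandled signal is discarded, as in SDL */
}
```

For logic shaped like this, the SDL diagram, the generated code, and the MSC test cases all stay in step; for loop-heavy algorithms or table lookups the diagram adds nothing, which is the point of the two "don't" items above.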
Conclusion
SDT tool chain + private adaptations = a highly capable environment
- Highly parallel software development
- Integration and regression testing throughout nearly the whole process
- Automated documentation, testing and report generation
- Measurable increase in development speed and product quality
SAM 2000 Josef Maier Günter Kohler