
Legend · 212 Posts · Discussion Starter · #1 ·
It seems to me that there are a few software engineers on this forum, like myself, who use test cases to methodically test software: a list of tests, with expected results and actual results reported.

Maybe D* or someone else could provide a test scenario, something like this:

Test  Step                     Expected Result            Actual Result (if failed, provide info)
1     Press Guide on remote    Guide menu pops up         Menu popped up
2     Select All Channels      Guide shows all channels   All channels appeared
3     Schedule HD OTA record   Recording works            Blew up; only 30 mins recorded

This way there would be some structured testing instead of a bunch of random, chance error reports. Those reports are useful, but for feedback to the programmers at D*, structured results would go a long way.
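Purely as an illustration (the class, field names, and pass/fail rule below are my own, not anything D* actually uses), here's a minimal Python sketch of how a checklist like the one above could be recorded and printed:

```python
from dataclasses import dataclass

@dataclass
class TestStep:
    """One row of the checklist: what was done, what should happen, what did happen."""
    number: int
    step: str
    expected: str
    actual: str = ""

    @property
    def passed(self) -> bool:
        # A step counts as passed only when an actual result was recorded
        # and it matches the expected result exactly.
        return bool(self.actual) and self.actual == self.expected

def report(steps):
    """Print a fixed-width table of steps, flagging any failures for follow-up."""
    print(f"{'Test':<6}{'Step':<26}{'Expected':<28}{'Actual':<32}{'Result'}")
    for s in steps:
        status = "PASS" if s.passed else "FAIL"
        print(f"{s.number:<6}{s.step:<26}{s.expected:<28}{s.actual:<32}{status}")

# Sample run mirroring the checklist above (step text is illustrative).
checklist = [
    TestStep(1, "Press Guide on remote", "Guide menu pops up", "Guide menu pops up"),
    TestStep(2, "Select All Channels", "Guide shows all channels", "Guide shows all channels"),
    TestStep(3, "Schedule HD OTA record", "Recording works", "Blew up; only 30 mins recorded"),
]
report(checklist)
```

Matching the actual text against the expected text is crude, but even a simple log like this pins down exactly which step blew up.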

 

Legend · 212 Posts · Discussion Starter · #5 ·
laxcoach said:
I think there are two problems with this scenario:

1 - I bet D* has test cases that they run and must pass, and they are probably more detailed than any we would create.

2 - Most of the bugs that people encounter are not reproducible with a step-by-step process. The ones that are reproducible get fixed in one or two releases. If the rest were reproducible, D* would create test cases for them and fix them.
I see a lot of "I think I did this" or "Not sure what I did." This would at least give more structure to exactly what was done when something blows up or doesn't work.

I wouldn't expect the same detailed test cases that D* runs, but for us, putting the HR20 through its "normal expected functionality" would go a long way.
 