Updated SRS (markdown)

Patrick 2021-06-10 14:33:33 +02:00
parent a870b8ebce
commit a39282ae17

SRS.md · 24 lines changed
```diff
@@ -113,7 +113,7 @@ The website consists of four main components:
 ### 2.3 User characteristics
 We aim for the same kind of users that amazon has today.
-So we need to make sure, that we have a very good useability.
+So we need to make sure that we have very good usability.
 ## 3. Specific Requirements
@@ -126,7 +126,7 @@ The backend should read data from the database and serve it to the frontend.
 The database stores the findings from the webcrawler. A history is stored as well.
 #### 3.1.3 Functionality Webcrawler
-The webcrawler crawls a predefined set of websites to exctract for a predefined list of products.
+The webcrawler crawls a predefined set of websites to extract data for a predefined list of products.
 ### 3.2 Functionality - User Interface
@@ -137,10 +137,10 @@ The User interface should be intuitive and appealing. It should be lightweight a
 We aim for excellent usability. Every person that is able to use websites like amazon should be able to use our website.
 ### 3.4 Reliability
-TBD
+We cannot, of course, guarantee that the prices we crawl are correct or up to date, so we give no warranty in this regard.
 #### 3.4.1 Availability
-99.642%
+99.2016%
 ### 3.5 Performance
 The website should be as performant as most modern websites, so there should not be any waiting times >1s. The web crawler will run every night, so performance won't be an issue there. The website will remain usable while the crawler is running, as the crawler only adds history points to the database.
@@ -150,19 +150,19 @@ Response time should be very low, on par with other modern websites.
 Max. 50ms
 #### 3.5.2 Throughput
-The user traffic should not exceed 100 MBit/s.
+The user traffic should not exceed 100 MBit/s. As we load pictures from a different server, the throughput is split, improving performance for all users.
 #### 3.5.3 Capacity
 The size of the database should not exceed 100 GB in the first iteration.
 #### 3.5.4 Resource utilization
-We plan to run the service on 2 vServers, one of which runs the webserver and database and both of them running the web crawler.
+We plan to run the service on 2 vServers: one runs the webserver and the database, and both run the web crawler. The crawler will also be implemented so that it can easily be run on more servers if needed.
 ### 3.6 Supportability
-TBD
+The service will stay online as long as the domain belongs to us. We cannot guarantee that it will remain online for long after the final presentation, as that would imply ongoing costs for both the domain and the servers.
 ### 3.7 Design Constraints
-TBD
+Standard Angular and ExpressJS patterns will be followed.
 #### 3.7.1 Development tools
 IntelliJ Ultimate
@@ -176,17 +176,17 @@ All platforms that can run a recent browser.
 The user documentation will be part of this project documentation and will therefore also be hosted in this wiki.
 ### 3.9 Purchased Components
-TBD
+As we only use open-source tools such as MariaDB and Jenkins, the only purchased components will be the domain and the servers.
 ### 3.10 Licensing Requirements
-TBD
+The project is licensed under the MIT License.
 ### 3.11 Legal, Copyright and other Notices
-TBD
+As stated in the license, everyone is free to use the project, provided that they include our copyright notice in their product.
 ### 3.12 Applicable Standards
 We will follow standard code conventions for TypeScript and Python. A more detailed description of naming conventions etc. will be published as a separate article in this wiki.
 ## 4. Supporting Information
-TBD
+N/A
```
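The crawler behaviour described in 3.1.3 and 3.5 (a nightly pass over a predefined set of websites and a predefined product list that only *appends* history points, so the website stays usable during a crawl) can be sketched roughly as follows. This is an illustrative sketch only; the names (`fetch_price`, `nightly_crawl`, `history`) are hypothetical and not from the project.

```python
# Sketch of the nightly crawl: append-only history points per (site, product).
# All identifiers here are illustrative, not from the actual codebase.
from datetime import datetime, timezone

SITES = ["https://shop-a.example", "https://shop-b.example"]  # predefined set of websites
PRODUCTS = ["usb-cable", "ssd-1tb"]                           # predefined list of products

history: list[dict] = []  # stands in for the history table in the database


def fetch_price(site: str, product: str) -> float:
    """Placeholder for the real scraping logic."""
    return 9.99


def nightly_crawl() -> None:
    now = datetime.now(timezone.utc).isoformat()
    for site in SITES:
        for product in PRODUCTS:
            # Append-only: existing rows are never modified, so the website
            # can keep reading the database while the crawler runs.
            history.append({"site": site, "product": product,
                            "price": fetch_price(site, product),
                            "crawled_at": now})


nightly_crawl()
print(len(history))  # 2 sites x 2 products = 4 new history points per run
```

Because each run only inserts new rows, scaling out to more servers (as 3.5.4 anticipates) mainly means partitioning the site/product pairs across crawler instances.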
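For context, the 99.2016% availability target introduced in 3.4.1 translates into a concrete yearly downtime budget. A quick back-of-the-envelope helper (the function name is mine, not from the project):

```python
# Convert an availability percentage into the yearly downtime it permits.
# The 99.2016% figure is taken from the SRS; the helper is illustrative.

def allowed_downtime_hours(availability_percent: float,
                           hours_per_year: float = 365 * 24) -> float:
    """Hours per year the service may be down at the given availability."""
    return (1 - availability_percent / 100) * hours_per_year


# 99.2016% availability permits roughly 70 hours (~2.9 days) of downtime per year.
print(round(allowed_downtime_hours(99.2016), 1))  # -> 69.9
```

That budget comfortably covers nightly crawler runs and occasional maintenance windows, which is presumably why the target was relaxed from the earlier 99.642% (about 31 hours per year).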