mirror of https://github.com/Mueller-Patrick/Betterzon.git
synced 2024-11-22 14:23:57 +00:00

Commit 1d77ebf342 (parent 67224defcc): Small adjustments to SRS, file changed: SRS.md

### 1.1 Purpose

This Software Requirements Specification (SRS) describes all specifications for the application "Betterzon". It includes an overview of the project and its vision, detailed information about the planned features, and the boundary conditions of the development process.

### 1.2 Scope

**The project is going to be realized as a web application. Planned functions include:**

- Searching for products that are also listed on Amazon

So we need to make sure that we have very good usability.

### 3.1 Functionality

#### 3.1.1 Functionality - Frontend

The user interface should be intuitive and appealing. It should be lightweight and not overloaded with features, so that anyone is easily able to use our service.

#### 3.1.2 Functionality – Backend

The backend should read data from the database and serve it to the frontend.

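The backend's read-and-serve role described above can be sketched as follows. This is a hypothetical illustration, not the project's actual code: the `products` table, its columns, and the `get_products` helper are assumptions. Section 3.12 mentions TypeScript and Python; Python with an in-memory SQLite database is used here to keep the sketch self-contained.

```python
import json
import sqlite3

def get_products(conn: sqlite3.Connection) -> str:
    """Read all products from the database and return them as JSON for the frontend."""
    rows = conn.execute(
        "SELECT id, name, best_price_cents FROM products ORDER BY name"
    ).fetchall()
    return json.dumps(
        [{"id": r[0], "name": r[1], "best_price_cents": r[2]} for r in rows]
    )

# Stand-in database; the real backend would connect to the project's database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, best_price_cents INTEGER)"
)
conn.executemany(
    "INSERT INTO products (name, best_price_cents) VALUES (?, ?)",
    [("USB-C cable", 799), ("Mechanical keyboard", 5999)],
)
payload = get_products(conn)
print(payload)
```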
#### 3.1.3 Functionality – Database

The database stores the findings from the web crawler. A price history is also stored.

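A minimal sketch of this storage layout, assuming a relational schema: the table and column names below are invented for illustration and are not the project's actual schema. The key point is that crawler findings are appended rather than overwritten, so a history is kept per product.

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
# Hypothetical schema: one table for products, one append-only history table.
conn.executescript("""
CREATE TABLE products (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE price_history (
    product_id  INTEGER NOT NULL REFERENCES products(id),
    vendor      TEXT NOT NULL,
    price_cents INTEGER NOT NULL,
    crawled_at  TEXT NOT NULL
);
""")
conn.execute("INSERT INTO products (id, name) VALUES (1, 'USB-C cable')")

def record_finding(product_id: int, vendor: str, price_cents: int) -> None:
    """Append one crawler finding; earlier rows are kept as history."""
    conn.execute(
        "INSERT INTO price_history VALUES (?, ?, ?, ?)",
        (product_id, vendor, price_cents,
         datetime.now(timezone.utc).isoformat()),
    )

record_finding(1, "example-shop.test", 899)
record_finding(1, "example-shop.test", 849)  # next night's crawl

history = conn.execute(
    "SELECT price_cents FROM price_history WHERE product_id = 1 ORDER BY rowid"
).fetchall()
print(history)  # [(899,), (849,)]
```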
#### 3.1.4 Functionality – Webcrawler

The web crawler crawls a predefined set of websites to extract prices for a predefined list of products.

#### 3.2.1 User system
### 3.3 Usability

We aim for excellent usability. Every person who is able to use websites like Amazon should be able to use our website.

### 3.4 Reliability

TBD

#### 3.4.1 Availability

99.642%

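As a worked example of what this target permits, the arithmetic below converts 99.642% availability into an annual downtime budget (using 365.25 days as the average year length):

```python
# Downtime budget implied by a 99.642% availability target.
availability = 0.99642
hours_per_year = 365.25 * 24                      # 8766.0 hours
downtime_hours = (1 - availability) * hours_per_year
print(round(downtime_hours, 1))                   # ~31.4 hours of downtime per year
```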
### 3.5 Performance

The website should be as performant as most modern websites, so there should not be any waiting times above 1 s. The web crawler will run every night, so performance will not be an issue there. The website will remain usable while the crawler is running, as the crawler only adds history points to the database.

#### 3.5.1 Response time

Response time should be very low, on par with other modern websites: max. 50 ms.

#### 3.5.2 Throughput

The user traffic should not exceed 100 MBit/s.

#### 3.5.3 Capacity

The size of the database should not exceed 100 GB in the first iteration.

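A back-of-the-envelope check of this budget, assuming roughly 100 bytes per stored history row (an invented figure, not a measured value from the project):

```python
# How many history points fit into the 100 GB first-iteration budget,
# under the assumed ~100 bytes per row.
bytes_per_row = 100
budget_bytes = 100 * 10**9
rows = budget_bytes // bytes_per_row
print(rows)  # 1000000000, i.e. about a billion history points
```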
#### 3.5.4 Resource utilization

We plan to run the service on two vServers: one of them runs the web server and the database, and both of them run the web crawler.

### 3.6 Supportability

TBD

### 3.7 Design Constraints

TBD

#### 3.7.1 Development tools

- IntelliJ Ultimate
- GitHub
- Jenkins

#### 3.7.4 Supported Platforms

All platforms that can run a recent browser.

### 3.8 Online User Documentation and Help System Requirements

The user documentation will be part of this project documentation and will therefore also be hosted in this wiki.

### 3.9 Purchased Components

TBD

### 3.10 Licensing Requirements

TBD

### 3.11 Legal, Copyright and other Notices

TBD

### 3.12 Applicable Standards

We will follow standard code conventions for TypeScript and Python. A more detailed description of naming conventions etc. will be published as a separate article in this wiki.

## 4. Supporting Information

TBD