Master deployment (#67)
* BETTERZON-58: Basic Functionality with scrapy (#33)
* BETTERZON-73: Adding API endpoint that returns the lowest non-amazon prices for a given list of product ids (#32)
* BETTERZON-75: User registration API endpoint (#34)
* BETTERZON-75: Adding backend functions to enable user registration
* BETTERZON-75: Adding regex to check email and username
* BETTERZON-83: FE unit testing (#35)
* BETTERZON-83: Making pre-generated unit tests work
* BETTERZON-83: Writing unit tests for angular to improve code coverage
* BETTERZON-79: Adding API endpoint for logging in (#36)
* BETTERZON-84: Adding service method to check if a session is valid (#37)
* BETTERZON-77: Changing error behavior, as the previous behavior could have opened up security vulnerabilities (#38)
* BETTERZON-76: Adding method descriptions for backend service methods (#40)
* Adding Codacy code quality badge to README
* BETTERZON-89: Refactoring / Reformatting and adding unit tests (#41)
* BETTERZON-90: Adding API endpoint for creating price alarms (#42)
* BETTERZON-91: Adding API endpoint to GET all price alarms for the currently logged-in user (#43)
* BETTERZON-92: Adding API endpoint to edit (update) price alarms (#44)
* BETTERZON-99: Adding some basic cucumber tests (#45)
* BETTERZON-100: Switching to cookies for session management (#46)
* BETTERZON-100: Switching session handling to cookies
* BETTERZON-100: Some code reformatting
* BETTERZON-100: Some more code reformatting
* BETTERZON-93: Adding API endpoint to get managed shops (#47)
* BETTERZON-94: Adding API endpoint to deactivate price listings as a vendor manager (#48)
* BETTERZON-97: Adding API endpoint to get all products listed by a specific vendor (#50)
* BETTERZON-98: Adding API endpoint for adding price entries as a registered vendor manager (#51)
* BETTERZON-95: Adding API endpoint for getting, inserting and updating contact persons (#52)
* BETTERZON-58 (#53)
* BETTERZON-58: Basic Functionality with scrapy
* Added independent crawler function, yielding price
* moved logic to amazon.py
* .
* moved scrapy files to unused folder
* Added basic amazon crawler using beautifulsoup4
* Connected Api to Crawler
* Fixed string concatenation for sql statement in getProductLinksForProduct
* BETTERZON-58: Fixing SQL insert
* BETTERZON-58: Adding access key verification
* BETTERZON-58: Fixing API endpoint of the crawler
- The list of products in the API request was treated like a string, so only the first product was crawled
* Added another selector for price on amazon (does not work for books)
Co-authored-by: root <root@DESKTOP-ARBPL82.localdomain>
Co-authored-by: Patrick Müller <patrick@mueller-patrick.tech>
Co-authored-by: Patrick <50352812+Mueller-Patrick@users.noreply.github.com>
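The string-vs-list bug fixed in BETTERZON-58 above can be sketched as follows; the raw parameter name and the comma-separated format are illustrative assumptions, not taken from the actual API:

```python
# Hypothetical request parameter as it arrives at the endpoint: a raw
# string such as "1,2,3" rather than a list of ints (assumed format).
raw_param = "1,2,3"

# Buggy variant: indexing the string yields a single character, so only
# the first product ID is ever seen and crawled.
first_only = raw_param[0]  # "1" — the rest of the list is ignored

# Fixed variant: split the string and convert before iterating.
product_ids = [int(pid) for pid in raw_param.split(",")]  # [1, 2, 3]
```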
* BETTERZON-96: Adding API endpoint for delisting a whole vendor (#54)
* BETTERZON-101: Adding service functions for pricealarms api (#55)
- Not fully tested yet, as the login functionality required for testing is not yet implemented
* BETTERZON-110: Refactoring, reformatting and commenting api service (#56)
* BETTERZON-107: Refactoring code with Proxy as design pattern (#49)
* BETTERZON-78 (#39)
* BETTERZON-31, dependencies.
* BETTERZON-31: Fixing dependencies
* BETTERZON-31,
BETTERZON-50
info popover and footer had been changed.
* BETTERZON-74
simple top-bar has been created.
* WIP: creating footer using grid.
* BETTERZON-78 adding bottom bar and top bar
* Adding cookieconsent as dependency again since it was removed by a merge
* Adding cookieconsent as dependency again since it was removed by a merge
* Apply suggestions from code review
Switching from single to double quotes
* BETTERZON-78 - grid added, structured as in Adobe XD mockup
Co-authored-by: Patrick Müller <patrick@mueller-patrick.tech>
Co-authored-by: Patrick <50352812+Mueller-Patrick@users.noreply.github.com>
* BETTERZON-109 (#57)
* BETTERZON-31, dependencies.
* BETTERZON-31: Fixing dependencies
* BETTERZON-31,
BETTERZON-50
info popover and footer had been changed.
* BETTERZON-74
simple top-bar has been created.
* WIP: creating footer using grid.
* BETTERZON-78 adding bottom bar and top bar
* Adding cookieconsent as dependency again since it was removed by a merge
* Adding cookieconsent as dependency again since it was removed by a merge
* Apply suggestions from code review
Switching from single to double quotes
* BETTERZON-78 - grid added, structured as in Adobe XD mockup
* wip: component rewritten, simple grid applied.
* wip: new component created and added to the app.module.ts. Added a minimal grid layout.
* wip: all components were wrapped now. Grid structure has been applied to the main wrapper-class "container".
* wip: component created and added to the app.module.ts
Co-authored-by: Patrick Müller <patrick@mueller-patrick.tech>
Co-authored-by: Patrick <50352812+Mueller-Patrick@users.noreply.github.com>
* BETTERZON-108 (#58)
* BETTERZON-31, dependencies.
* BETTERZON-31: Fixing dependencies
* BETTERZON-31,
BETTERZON-50
info popover and footer had been changed.
* BETTERZON-74
simple top-bar has been created.
* WIP: creating footer using grid.
* BETTERZON-78 adding bottom bar and top bar
* Adding cookieconsent as dependency again since it was removed by a merge
* Adding cookieconsent as dependency again since it was removed by a merge
* Apply suggestions from code review
Switching from single to double quotes
* BETTERZON-78 - grid added, structured as in Adobe XD mockup
* wip: component rewritten, simple grid applied.
* wip: new component created and added to the app.module.ts. Added a minimal grid layout.
* wip: all components were wrapped now. Grid structure has been applied to the main wrapper-class "container".
Co-authored-by: Patrick Müller <patrick@mueller-patrick.tech>
Co-authored-by: Patrick <50352812+Mueller-Patrick@users.noreply.github.com>
* BETTERZON-106 (#59)
* BETTERZON-31, dependencies.
* BETTERZON-31: Fixing dependencies
* BETTERZON-31,
BETTERZON-50
info popover and footer had been changed.
* BETTERZON-74
simple top-bar has been created.
* WIP: creating footer using grid.
* BETTERZON-78 adding bottom bar and top bar
* Adding cookieconsent as dependency again since it was removed by a merge
* Adding cookieconsent as dependency again since it was removed by a merge
* Apply suggestions from code review
Switching from single to double quotes
* BETTERZON-78 - grid added, structured as in Adobe XD mockup
* wip: component rewritten, simple grid applied.
* wip: new component created and added to the app.module.ts. Added a minimal grid layout.
Co-authored-by: Patrick Müller <patrick@mueller-patrick.tech>
Co-authored-by: Patrick <50352812+Mueller-Patrick@users.noreply.github.com>
* BETTERZON-102 (#60)
* BETTERZON-31, dependencies.
* BETTERZON-31: Fixing dependencies
* BETTERZON-31,
BETTERZON-50
info popover and footer had been changed.
* BETTERZON-74
simple top-bar has been created.
* WIP: creating footer using grid.
* BETTERZON-78 adding bottom bar and top bar
* Adding cookieconsent as dependency again since it was removed by a merge
* Adding cookieconsent as dependency again since it was removed by a merge
* Apply suggestions from code review
Switching from single to double quotes
* BETTERZON-78 - grid added, structured as in Adobe XD mockup
* wip: component rewritten, simple grid applied.
Co-authored-by: Patrick Müller <patrick@mueller-patrick.tech>
Co-authored-by: Patrick <50352812+Mueller-Patrick@users.noreply.github.com>
* BETTERZON-113, BETTERZON-114, BETTERZON-115: Adding API endpoint for favorite shops (#61)
* BETTERZON-116: Adding API endpoint for searching a new product (#62)
* BETTERZON-117: Adding API endpoint for getting the latest crawling status (#63)
* BETTERZON-111: Adding service functions for login and registration (#64)
* BETTERZON-112: Adding service functions for managing vendor shops (#65)
* BETTERZON-118: Adding service functions for managing favorite shops (#66)
Co-authored-by: henningxtro <sextro.henning@student.dhbw-karlsruhe.de>
Co-authored-by: root <root@DESKTOP-ARBPL82.localdomain>
Co-authored-by: Reboooooorn <61185041+Reboooooorn@users.noreply.github.com>
2021-05-29 08:58:27 +00:00
import sql
import requests
from bs4 import BeautifulSoup

HEADERS = ({'User-Agent':
                'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.138 '
                'Safari/537.36'})


def crawl(product_ids: [int]) -> dict:
    """
    Crawls the given list of products and saves the results to sql

    :param product_ids: The list of product IDs to fetch
    :return: A dict with the following fields:
        total_crawls: number of total crawl tries (products * vendors per product)
        successful_crawls: number of successfully crawled products
        products_with_problems: list of products that have not been crawled successfully
    """
    total_crawls = 0
    successful_crawls = 0
    products_with_problems = []

    # Iterate over every product that has to be crawled
    for product_id in product_ids:
        # Get all links for this product
        product_links = sql.getProductLinksForProduct(product_id)

        crawled_data = []

        # Iterate over every link / vendor
        for product_vendor_info in product_links:
            total_crawls += 1

            # Call the appropriate vendor crawling function and append the result to the list of crawled data
            if product_vendor_info['vendor_id'] == 1:
                # Amazon
                data = __crawl_amazon__(product_vendor_info)
                if data:
                    crawled_data.append(data)
            elif product_vendor_info['vendor_id'] == 2:
                # Apple
                data = __crawl_apple__(product_vendor_info)
                if data:
                    crawled_data.append(data)
            elif product_vendor_info['vendor_id'] == 3:
                # Media Markt
                data = __crawl_mediamarkt__(product_vendor_info)
                if data:
                    crawled_data.append(data)
            else:
                # Unknown vendor — record the problem and skip this entry
                products_with_problems.append(product_vendor_info)
                continue

            successful_crawls += 1

        # Insert data to SQL
        sql.insertData(crawled_data)

    return {
        'total_crawls': total_crawls,
        'successful_crawls': successful_crawls,
        'products_with_problems': products_with_problems
    }


def __crawl_amazon__(product_info: dict) -> tuple:
    """
    Crawls the price for the given product from amazon

    :param product_info: A dict with product info containing product_id, vendor_id, url
    :return: A tuple with the crawled data, containing (product_id, vendor_id, price_in_cents)
    """
    page = requests.get(product_info['url'], headers=HEADERS)
    soup = BeautifulSoup(page.content, features="lxml")

    try:
        price_tag = soup.find(id='priceblock_ourprice')
        if price_tag is None:
            # Fall back to the buybox price if the main price block is missing
            price_tag = soup.find(id='price_inside_buybox')
        price = int(price_tag.get_text().replace(".", "").replace(",", "").replace("€", "").strip())
    except (AttributeError, ValueError):
        # Neither selector matched, or the matched text was not a parsable price
        price = -1

    if price != -1:
        return (product_info['product_id'], product_info['vendor_id'], price)
    else:
        return None


def __crawl_apple__(product_info: dict) -> tuple:
    """
    Crawls the price for the given product from apple

    :param product_info: A dict with product info containing product_id, vendor_id, url
    :return: A tuple with the crawled data, containing (product_id, vendor_id, price_in_cents)
    """
    # return (product_info['product_id'], product_info['vendor_id'], 123)
    pass


def __crawl_mediamarkt__(product_info: dict) -> tuple:
    """
    Crawls the price for the given product from media markt

    :param product_info: A dict with product info containing product_id, vendor_id, url
    :return: A tuple with the crawled data, containing (product_id, vendor_id, price_in_cents)
    """
    pass
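The price extraction in `__crawl_amazon__` boils down to normalising a German-localized price string into an integer cent amount; a minimal standalone sketch of that step (the function name is illustrative, not part of the module):

```python
def parse_price_to_cents(price_text: str) -> int:
    """Turn a price string like '1.234,56 €' into an integer cent amount."""
    # Strip the thousands separator, decimal comma, currency symbol and
    # surrounding whitespace, mirroring the chained .replace() calls in the crawler.
    return int(price_text.replace(".", "").replace(",", "").replace("€", "").strip())

print(parse_price_to_cents("1.234,56 €"))  # → 123456
print(parse_price_to_cents("19,99€"))      # → 1999
```

Note that this relies on the price always carrying exactly two decimal digits; a string without a decimal part (e.g. "12 €") would be misread as 12 cents.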