Compare commits
87 Commits
| SHA1 |
|---|
| f3746ff756 |
| 9f16d37c8b |
| 8a13c7940a |
| 8bea76ff67 |
| 1504bd744c |
| 6449f10615 |
| d79509bda7 |
| 630b847466 |
| ed11611919 |
| e2431c938d |
| 60f4b4a5ed |
| d41097af20 |
| 8a5d505731 |
| 36e76c6f41 |
| 717b9421dd |
| d2f71e8c94 |
| 697991f28f |
| b0e18ab766 |
| e39a6921d0 |
| aac1be0565 |
| 683fcb2138 |
| 9fbbef9b18 |
| 6e0b9a0a7b |
| 7f472f6f4f |
| b7d7b33ab9 |
| da11c0bb1f |
| eae433d2bd |
| c16bc37aff |
| 255b06ac9e |
| 29ec619126 |
| 247def04ff |
| 4600e7d953 |
| 850c266555 |
| ad374fe2fb |
| 5ca39b6fe7 |
| b50dd26e6f |
| 53eaccaa9b |
| 91f207316a |
| 1e37418909 |
| 4c09ba5529 |
| 7bab4747ad |
| fd8cc7378c |
| 8aeef4d5e7 |
| 4bafde9da7 |
| 5a3107aecf |
| 7e758720f0 |
| 39e3e249f8 |
| 118c5b056e |
| 2c3b5599fe |
| e421eaa324 |
| 75d6bc6808 |
| 98c547e416 |
| 45250e36e4 |
| fa7544d052 |
| 53f3fc5ee9 |
| 1b36de4131 |
| 6f0c6f6284 |
| b7dda5bf87 |
| 14f33a40c3 |
| 5c904aced0 |
| 53a3bbf531 |
| 50586f1ce7 |
| 9f6235a0fc |
| 4d21f150ce |
| 7c0dfc49dd |
| 269b13f6c1 |
| a9bb7d2e5a |
| 11295f27a7 |
| 55aa3dd85b |
| 20272d4360 |
| 623dc92ef2 |
| 2d59394b1e |
| 26c2095ff1 |
| ec7d241caa |
| d0432ed1aa |
| 8c5503d002 |
| 6d6f950c95 |
| 30745e54ba |
| c3fd94e79e |
| 2924a8d67b |
| 9f4c4bb9cf |
| 3d6eebf06e |
| b3d9b6ff7e |
| 60facacc48 |
| b8a6063838 |
| bcba2be524 |
| f7187d2017 |
@@ -1,5 +1,5 @@
# pyasic
*A set of modules for interfacing with many common types of ASIC bitcoin miners, using both their API and SSH.*
*A simplified and standardized interface for Bitcoin ASICs.*

[](https://github.com/psf/black)
[](https://pypi.org/project/pyasic/)

@@ -263,7 +263,7 @@ Pyasic implements a few dataclasses as helpers to make data return types consistent

[`MinerData`][pyasic.data.MinerData] is the return from the [`get_data()`](#get-data) function, and is used to have a consistent dataset across all returns.

You can call [`MinerData.asdict()`][pyasic.data.MinerData.asdict] to get the dataclass as a dictionary, and there are many other helper functions contained in the class to convert to different data formats.
You can call [`MinerData.as_dict()`][pyasic.data.MinerData.as_dict] to get the dataclass as a dictionary, and there are many other helper functions contained in the class to convert to different data formats.
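
For example, a minimal sketch (assuming `miner` was created with `pyasic.get_miner()` and a reachable IP):

```python
data = await miner.get_data()

# dump the full dataset as a plain dictionary
print(data.as_dict())
```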

[`MinerData`][pyasic.data.MinerData] instances can also be added to each other to combine their data, and can be divided by a number to divide all their data, allowing you to get average data from many miners. For example:
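A minimal sketch of that pattern; `miners` is assumed to be a list of already-created miner objects:

```python
# gather data from each miner, then combine with + and average with /
data = [await m.get_data() for m in miners]

combined = data[0]
for d in data[1:]:
    combined = combined + d

average = combined / len(data)
print(average.hashrate)
```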
@@ -288,3 +288,27 @@ It is the return from [`get_config()`](#get-config).

Each miner has a unique way to convert the [`MinerConfig`][pyasic.config.MinerConfig] to its specific type; there are helper functions in the class for this.
In most cases these helper functions should not be used, as [`send_config()`](#send-config) takes a [`MinerConfig`][pyasic.config.MinerConfig] and will do the conversion to the right type for you.

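A hedged sketch of that flow; `autotuning_wattage` is a BOSMiner-oriented field taken from elsewhere in this changeset and is used here purely for illustration:

```python
cfg = await miner.get_config()
cfg.autotuning_wattage = 900  # illustrative change; available fields vary by miner
await miner.send_config(cfg)
```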
## Settings
`pyasic` has settings designed to make using large groups of miners easier. You can set the default password for all types of miners using the [`pyasic.settings`][pyasic.settings] module, as follows:

```python
from pyasic import settings

settings.update("default_antminer_password", "my_pwd")
```

Here are all of the settings and their default values:
```
"network_ping_retries": 1,
"network_ping_timeout": 3,
"network_scan_threads": 300,
"factory_get_retries": 1,
"get_data_retries": 1,
"default_whatsminer_password": "admin",
"default_innosilicon_password": "admin",
"default_antminer_password": "root",
"default_bosminer_password": "root",
"default_vnish_password": "admin",
"default_goldshell_password": "123456789",
```

@@ -1,6 +1,14 @@
# pyasic
## Miner Factory

[`MinerFactory`][pyasic.miners.miner_factory.MinerFactory] is the way to create miner instances of the correct type in `pyasic`. The most important method is [`get_miner()`][pyasic.get_miner], which is mapped to [`pyasic.get_miner()`][pyasic.get_miner] and should be used from there.
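
A minimal usage sketch (the IP address is a placeholder):

```python
import asyncio

import pyasic


async def main():
    # identify the miner at this address and get the right class instance
    miner = await pyasic.get_miner("192.168.1.10")  # placeholder IP
    print(miner)


asyncio.run(main())
```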

The instance used for [`pyasic.get_miner()`][pyasic.get_miner] is `pyasic.miner_factory`.

[`MinerFactory`][pyasic.MinerFactory] also keeps a cache, which can be cleared if needed with `pyasic.miner_factory.clear_cached_miners()`.

Finally, there is functionality to get multiple miners without using `asyncio.gather()` explicitly. Use `pyasic.miner_factory.get_multiple_miners()` with a list of IPs as strings to get a list of miner instances. You can also get multiple miners with an `AsyncGenerator` by using `pyasic.miner_factory.get_miner_generator()`.
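
A short sketch of both helpers (the IP addresses are placeholders):

```python
import asyncio

import pyasic


async def main():
    ips = ["192.168.1.10", "192.168.1.11", "192.168.1.12"]  # placeholder IPs

    # list of miner instances, gathered concurrently for you
    miners = await pyasic.miner_factory.get_multiple_miners(ips)
    print(miners)

    # or consume miners one at a time as they are identified
    async for miner in pyasic.miner_factory.get_miner_generator(ips):
        print(miner)


asyncio.run(main())
```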

::: pyasic.miners.miner_factory.MinerFactory
    handler: python
    options:
|
||||

docs/settings/settings.md (new file, 23 lines)
@@ -0,0 +1,23 @@
# pyasic
## settings

::: pyasic.settings
    handler: python
    options:
      show_root_heading: false
      heading_level: 4


### get
::: pyasic.settings.get
    handler: python
    options:
      show_root_heading: false
      heading_level: 4

### update
::: pyasic.settings.update
    handler: python
    options:
      show_root_heading: false
      heading_level: 4

@@ -53,7 +53,8 @@ nav:
      - Goldshell X5: "miners/goldshell/X5.md"
      - Goldshell XMax: "miners/goldshell/XMax.md"
  - Base Miner: "miners/base_miner.md"

  - Settings:
      - Settings: "settings/settings.md"

plugins:
  - mkdocstrings

@@ -20,7 +20,7 @@ import json
|
||||
import logging
|
||||
import re
|
||||
import warnings
|
||||
from typing import Tuple, Union
|
||||
from typing import Union
|
||||
|
||||
from pyasic.errors import APIError, APIWarning
|
||||
|
||||
@@ -83,15 +83,15 @@ class BaseMinerAPI:
|
||||
data = self._load_api_data(data)
|
||||
|
||||
# check for if the user wants to allow errors to return
|
||||
if not ignore_errors:
|
||||
# validate the command succeeded
|
||||
validation = self._validate_command_output(data)
|
||||
if not validation[0]:
|
||||
if allow_warning:
|
||||
logging.warning(
|
||||
f"{self.ip}: API Command Error: {command}: {validation[1]}"
|
||||
)
|
||||
validation = self._validate_command_output(data)
|
||||
if not validation[0]:
|
||||
if not ignore_errors:
|
||||
# validate the command succeeded
|
||||
raise APIError(validation[1])
|
||||
if allow_warning:
|
||||
logging.warning(
|
||||
f"{self.ip}: API Command Error: {command}: {validation[1]}"
|
||||
)
|
||||
|
||||
logging.debug(f"{self} - (Send Command) - Received data.")
|
||||
return data
|
||||
@@ -118,11 +118,12 @@ class BaseMinerAPI:
|
||||
data = await self.send_command(command, allow_warning=allow_warning)
|
||||
except APIError as e:
|
||||
# try to identify the error
|
||||
if ":" in e.message:
|
||||
err_command = e.message.split(":")[0]
|
||||
if err_command in commands:
|
||||
commands.remove(err_command)
|
||||
continue
|
||||
if e.message is not None:
|
||||
if ":" in e.message:
|
||||
err_command = e.message.split(":")[0]
|
||||
if err_command in commands:
|
||||
commands.remove(err_command)
|
||||
continue
|
||||
return {command: [{}] for command in commands}
|
||||
logging.debug(f"{self} - (Multicommand) - Received data")
|
||||
data["multicommand"] = True
|
||||
@@ -206,16 +207,18 @@ If you are sure you want to use this command please use API.send_command("{comma
|
||||
logging.debug(f"{self} - ([Hidden] Send Bytes) - Draining")
|
||||
await writer.drain()
|
||||
try:
|
||||
ret_data = await asyncio.wait_for(reader.read(4096), timeout=timeout)
|
||||
except ConnectionAbortedError:
|
||||
return b"{}"
|
||||
try:
|
||||
# TO address a situation where a whatsminer has an unknown PW -AND-
|
||||
# Fix for stupid whatsminer bug, reboot/restart seem to not load properly in the loop
|
||||
# have to receive, save the data, check if there is more data by reading with a short timeout
|
||||
# append that data if there is more, and then onto the main loop.
|
||||
ret_data += await asyncio.wait_for(reader.read(1), timeout=1)
|
||||
except asyncio.TimeoutError:
|
||||
return ret_data
|
||||
# the password timeout might need to be longer than 1, but it seems to work for now.
|
||||
ret_data = await asyncio.wait_for(reader.read(1), timeout=1)
|
||||
except (asyncio.TimeoutError):
|
||||
return b"{}"
|
||||
try:
|
||||
ret_data += await asyncio.wait_for(reader.read(4096), timeout=timeout)
|
||||
except (ConnectionAbortedError):
|
||||
return b"{}"
|
||||
|
||||
# loop to receive all the data
|
||||
logging.debug(f"{self} - ([Hidden] Send Bytes) - Receiving")
|
||||
@@ -258,9 +261,12 @@ If you are sure you want to use this command please use API.send_command("{comma
|
||||
return False, data["Msg"]
|
||||
else:
|
||||
# make sure the command succeeded
|
||||
if type(data["STATUS"]) == str:
|
||||
if isinstance(data["STATUS"], str):
|
||||
if data["STATUS"] in ["RESTART"]:
|
||||
return True, None
|
||||
elif isinstance(data["STATUS"], dict):
|
||||
if data["STATUS"].get("STATUS") in ["S", "I"]:
|
||||
return True, None
|
||||
elif data["STATUS"][0]["STATUS"] not in ("S", "I"):
|
||||
# this is an error
|
||||
if data["STATUS"][0]["STATUS"] not in ("S", "I"):
|
||||
|
||||
@@ -22,15 +22,15 @@ import hashlib
|
||||
import json
|
||||
import logging
|
||||
import re
|
||||
from typing import Union
|
||||
from typing import Literal, Union
|
||||
|
||||
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
|
||||
from passlib.handlers.md5_crypt import md5_crypt
|
||||
|
||||
from pyasic import settings
|
||||
from pyasic.API import BaseMinerAPI
|
||||
from pyasic.errors import APIError
|
||||
from pyasic.misc import api_min_version
|
||||
from pyasic.settings import PyasicSettings
|
||||
|
||||
### IMPORTANT ###
|
||||
# you need to change the password of the miners using the Whatsminer
|
||||
@@ -40,6 +40,12 @@ from pyasic.settings import PyasicSettings
|
||||
# you change the password, you can pass that to this class as pwd,
|
||||
# or add it as the Whatsminer_pwd in the settings.toml file.
|
||||
|
||||
PrePowerOnMessage = Union[
|
||||
Literal["wait for adjust temp"],
|
||||
Literal["adjust complete"],
|
||||
Literal["adjust continue"],
|
||||
]
|
||||
|
||||
|
||||
def _crypt(word: str, salt: str) -> str:
|
||||
"""Encrypts a word with a salt, using a standard salt format.
|
||||
@@ -186,7 +192,7 @@ class BTMinerAPI(BaseMinerAPI):
|
||||
ip: str,
|
||||
api_ver: str = "0.0.0",
|
||||
port: int = 4028,
|
||||
pwd: str = PyasicSettings().global_whatsminer_password,
|
||||
pwd: str = settings.get("default_whatsminer_password", "admin"),
|
||||
):
|
||||
super().__init__(ip, port)
|
||||
self.pwd = pwd
|
||||
@@ -693,7 +699,7 @@ class BTMinerAPI(BaseMinerAPI):
|
||||
)
|
||||
return await self.send_privileged_command("set_power_pct", percent=str(percent))
|
||||
|
||||
async def pre_power_on(self, complete: bool, msg: str) -> dict:
|
||||
async def pre_power_on(self, complete: bool, msg: PrePowerOnMessage) -> dict:
|
||||
"""Configure or check status of pre power on.
|
||||
|
||||
<details>
|
||||
@@ -713,7 +719,7 @@ class BTMinerAPI(BaseMinerAPI):
|
||||
</details>
|
||||
"""
|
||||
|
||||
if not msg == "wait for adjust temp" or "adjust complete" or "adjust continue":
|
||||
if msg not in ("wait for adjust temp", "adjust complete", "adjust continue"):
|
||||
raise APIError(
|
||||
"Message is incorrect, please choose one of "
|
||||
'["wait for adjust temp", '
|
||||
@@ -729,6 +735,34 @@ class BTMinerAPI(BaseMinerAPI):
|
||||
)
|
||||
|
||||
### ADDED IN V2.0.5 Whatsminer API ###
|
||||
|
||||
@api_min_version("2.0.5")
|
||||
async def set_power_pct_v2(self, percent: int) -> dict:
|
||||
"""Set the power percentage of the miner based on current power. Used for temporary adjustment. Added in API v2.0.5.
|
||||
|
||||
<details>
|
||||
<summary>Expand</summary>
|
||||
|
||||
Set the power percentage of the miner, only works after changing
|
||||
the password of the miner using the Whatsminer tool.
|
||||
|
||||
Parameters:
|
||||
percent: The power percentage to set.
|
||||
Returns:
|
||||
A reply informing of the status of setting the power percentage.
|
||||
</details>
|
||||
"""
|
||||
|
||||
if not 0 < percent < 100:
|
||||
raise APIError(
|
||||
f"Power PCT % is outside of the allowed "
|
||||
f"range. Please set a % between 0 and "
|
||||
f"100"
|
||||
)
|
||||
return await self.send_privileged_command(
|
||||
"set_power_pct_v2", percent=str(percent)
|
||||
)
|
||||
|
||||
@api_min_version("2.0.5")
|
||||
async def set_temp_offset(self, temp_offset: int) -> dict:
|
||||
"""Set the offset of miner hash board target temperature.
|
||||
|
||||
@@ -13,6 +13,7 @@
|
||||
# See the License for the specific language governing permissions and -
|
||||
# limitations under the License. -
|
||||
# ------------------------------------------------------------------------------
|
||||
from pyasic import settings
|
||||
from pyasic.API.bmminer import BMMinerAPI
|
||||
from pyasic.API.bosminer import BOSMinerAPI
|
||||
from pyasic.API.btminer import BTMinerAPI
|
||||
@@ -29,10 +30,9 @@ from pyasic.data import (
|
||||
from pyasic.errors import APIError, APIWarning
|
||||
from pyasic.miners import get_miner
|
||||
from pyasic.miners.base import AnyMiner
|
||||
from pyasic.miners.miner_factory import MinerFactory
|
||||
from pyasic.miners.miner_factory import MinerFactory, miner_factory
|
||||
from pyasic.miners.miner_listener import MinerListener
|
||||
from pyasic.network import MinerNetwork
|
||||
from pyasic.settings import PyasicSettings
|
||||
|
||||
__all__ = [
|
||||
"BMMinerAPI",
|
||||
@@ -51,7 +51,8 @@ __all__ = [
|
||||
"get_miner",
|
||||
"AnyMiner",
|
||||
"MinerFactory",
|
||||
"miner_factory",
|
||||
"MinerListener",
|
||||
"MinerNetwork",
|
||||
"PyasicSettings",
|
||||
"settings",
|
||||
]
|
||||
|
||||
@@ -241,7 +241,7 @@ class MinerData:
|
||||
if item.hashrate is not None:
|
||||
hr_data.append(item.hashrate)
|
||||
if len(hr_data) > 0:
|
||||
return sum(hr_data)
|
||||
return round(sum(hr_data), 2)
|
||||
return self._hashrate
|
||||
|
||||
@hashrate.setter
|
||||
@@ -338,13 +338,16 @@ class MinerData:
|
||||
pass
|
||||
|
||||
def asdict(self) -> dict:
|
||||
logging.debug(f"MinerData - (To Dict) - Dumping Dict data")
|
||||
return asdict(self)
|
||||
|
||||
def as_dict(self) -> dict:
|
||||
"""Get this dataclass as a dictionary.
|
||||
|
||||
Returns:
|
||||
A dictionary version of this class.
|
||||
"""
|
||||
logging.debug(f"MinerData - (To Dict) - Dumping Dict data")
|
||||
return asdict(self)
|
||||
return self.asdict()
|
||||
|
||||
def as_json(self) -> str:
|
||||
"""Get this dataclass as JSON.
|
||||
|
||||
@@ -16,8 +16,6 @@
|
||||
|
||||
from dataclasses import asdict, dataclass, field, fields
|
||||
|
||||
C_N_CODES = ["52", "53", "54", "55", "56"]
|
||||
|
||||
|
||||
@dataclass
|
||||
class WhatsminerError:
|
||||
@@ -37,10 +35,8 @@ class WhatsminerError:
|
||||
|
||||
@property
|
||||
def error_message(self): # noqa - Skip PyCharm inspection
|
||||
if len(str(self.error_code)) > 3 and str(self.error_code)[:2] in C_N_CODES:
|
||||
# 55 error code base has chip numbers, so the format is
|
||||
# 55 -> board num len 1 -> chip num len 3
|
||||
err_type = 55
|
||||
if len(str(self.error_code)) == 6 and not str(self.error_code)[:1] == "1":
|
||||
err_type = int(str(self.error_code)[:2])
|
||||
err_subtype = int(str(self.error_code)[2:3])
|
||||
err_value = int(str(self.error_code)[3:])
|
||||
else:
|
||||
@@ -88,7 +84,9 @@ class WhatsminerError:
|
||||
|
||||
ERROR_CODES = {
|
||||
1: { # Fan error
|
||||
0: {0: "Fan unknown."},
|
||||
0: {
|
||||
0: "Fan unknown.",
|
||||
},
|
||||
1: { # Fan speed error of 1000+
|
||||
0: "Intake fan speed error.",
|
||||
1: "Exhaust fan speed error.",
|
||||
@@ -101,7 +99,9 @@ ERROR_CODES = {
|
||||
0: "Intake fan speed error. Fan speed deviates by more than 3000.",
|
||||
1: "Exhaust fan speed error. Fan speed deviates by more than 3000.",
|
||||
},
|
||||
4: {0: "Fan speed too high."}, # High speed
|
||||
4: {
|
||||
0: "Fan speed too high.",
|
||||
}, # High speed
|
||||
},
|
||||
2: { # Power error
|
||||
0: {
|
||||
@@ -126,6 +126,7 @@ ERROR_CODES = {
|
||||
6: "Power remained unchanged for a long time.",
|
||||
7: "Power set enable error.",
|
||||
8: "Power input voltage is lower than 230V for high power mode.",
|
||||
9: "Power input current is incorrect.",
|
||||
},
|
||||
3: {
|
||||
3: "Power output high temperature protection error.",
|
||||
@@ -159,6 +160,8 @@ ERROR_CODES = {
|
||||
6: {
|
||||
3: "Power communication warning.",
|
||||
4: "Power communication error.",
|
||||
5: "Power unknown error.",
|
||||
6: "Power unknown error.",
|
||||
7: "Power watchdog protection.",
|
||||
8: "Power output high current protection.",
|
||||
9: "Power input high current protection.",
|
||||
@@ -170,57 +173,134 @@ ERROR_CODES = {
|
||||
3: "Power input too high warning.",
|
||||
4: "Power fan warning.",
|
||||
5: "Power high temperature warning.",
|
||||
6: "Power unknown error.",
|
||||
7: "Power unknown error.",
|
||||
8: "Power unknown error.",
|
||||
9: "Power unknown error.",
|
||||
},
|
||||
8: {
|
||||
0: "Power unknown error.",
|
||||
1: "Power vendor status 1 bit 0 error.",
|
||||
2: "Power vendor status 1 bit 1 error.",
|
||||
3: "Power vendor status 1 bit 2 error.",
|
||||
4: "Power vendor status 1 bit 3 error.",
|
||||
5: "Power vendor status 1 bit 4 error.",
|
||||
6: "Power vendor status 1 bit 5 error.",
|
||||
7: "Power vendor status 1 bit 6 error.",
|
||||
8: "Power vendor status 1 bit 7 error.",
|
||||
9: "Power vendor status 2 bit 0 error.",
|
||||
},
|
||||
9: {
|
||||
0: "Power vendor status 2 bit 1 error.",
|
||||
1: "Power vendor status 2 bit 2 error.",
|
||||
2: "Power vendor status 2 bit 3 error.",
|
||||
3: "Power vendor status 2 bit 4 error.",
|
||||
4: "Power vendor status 2 bit 5 error.",
|
||||
5: "Power vendor status 2 bit 6 error.",
|
||||
6: "Power vendor status 2 bit 7 error.",
|
||||
},
|
||||
},
|
||||
3: { # temperature error
|
||||
0: { # sensor detection error
|
||||
"n": "Slot {n} temperature sensor detection error."
|
||||
"n": "Slot {n} temperature sensor detection error.",
|
||||
},
|
||||
2: { # temperature reading error
|
||||
"n": "Slot {n} temperature reading error.",
|
||||
9: "Control board temperature sensor communication error.",
|
||||
},
|
||||
5: {"n": "Slot {n} temperature protecting."}, # temperature protection
|
||||
6: {0: "Hashboard high temperature error."}, # high temp
|
||||
5: {
|
||||
"n": "Slot {n} temperature protecting.",
|
||||
}, # temperature protection
|
||||
6: {
|
||||
0: "Hashboard high temperature error.",
|
||||
1: "Hashboard high temperature error.",
|
||||
2: "Hashboard high temperature error.",
|
||||
3: "Hashboard high temperature error.",
|
||||
}, # high temp
|
||||
7: {
|
||||
0: "The environment temperature fluctuates too much.",
|
||||
}, # env temp
|
||||
8: {
|
||||
0: "Humidity sensor not found.",
|
||||
1: "Humidity sensor read error.",
|
||||
2: "Humidity sensor read error.",
|
||||
3: "Humidity sensor protecting.",
|
||||
},
|
||||
}, # humidity
|
||||
},
|
||||
4: { # EEPROM error
|
||||
0: {0: "Eeprom unknown error."},
|
||||
1: {"n": "Slot {n} eeprom detection error."}, # EEPROM detection error
|
||||
2: {"n": "Slot {n} eeprom parsing error."}, # EEPROM parsing error
|
||||
3: {"n": "Slot {n} chip bin type error."}, # chip bin error
|
||||
4: {"n": "Slot {n} eeprom chip number X error."}, # EEPROM chip number error
|
||||
5: {"n": "Slot {n} eeprom xfer error."}, # EEPROM xfer error
|
||||
0: {
|
||||
0: "Eeprom unknown error.",
|
||||
},
|
||||
1: {
|
||||
"n": "Slot {n} eeprom detection error.",
|
||||
}, # EEPROM detection error
|
||||
2: {
|
||||
"n": "Slot {n} eeprom parsing error.",
|
||||
}, # EEPROM parsing error
|
||||
3: {
|
||||
"n": "Slot {n} chip bin type error.",
|
||||
}, # chip bin error
|
||||
4: {
|
||||
"n": "Slot {n} eeprom chip number X error.",
|
||||
}, # EEPROM chip number error
|
||||
5: {
|
||||
"n": "Slot {n} eeprom xfer error.",
|
||||
}, # EEPROM xfer error
|
||||
},
|
||||
5: { # hashboard error
|
||||
0: {0: "Board unknown error."},
|
||||
1: {"n": "Slot {n} miner type error."}, # board miner type error
|
||||
2: {"n": "Slot {n} bin type error."}, # chip bin type error
|
||||
3: {"n": "Slot {n} not found."}, # board not found error
|
||||
4: {"n": "Slot {n} error reading chip id."}, # reading chip id error
|
||||
5: {"n": "Slot {n} has bad chips."}, # board has bad chips error
|
||||
6: {"n": "Slot {n} loss of balance error."}, # loss of balance error
|
||||
7: {"n": "Slot {n} xfer error chip."}, # xfer error
|
||||
8: {"n": "Slot {n} reset error."}, # reset error
|
||||
9: {"n": "Slot {n} frequency too low."}, # freq error
|
||||
0: {
|
||||
0: "Board unknown error.",
|
||||
},
|
||||
1: {
|
||||
"n": "Slot {n} miner type error.",
|
||||
}, # board miner type error
|
||||
2: {
|
||||
"n": "Slot {n} bin type error.",
|
||||
}, # chip bin type error
|
||||
3: {
|
||||
"n": "Slot {n} not found.",
|
||||
}, # board not found error
|
||||
4: {
|
||||
"n": "Slot {n} error reading chip id.",
|
||||
}, # reading chip id error
|
||||
5: {
|
||||
"n": "Slot {n} has bad chips.",
|
||||
}, # board has bad chips error
|
||||
6: {
|
||||
"n": "Slot {n} loss of balance error.",
|
||||
}, # loss of balance error
|
||||
7: {
|
||||
"n": "Slot {n} xfer error chip.",
|
||||
}, # xfer error
|
||||
8: {
|
||||
"n": "Slot {n} reset error.",
|
||||
}, # reset error
|
||||
9: {
|
||||
"n": "Slot {n} frequency too low.",
|
||||
}, # freq error
|
||||
},
|
||||
6: { # env temp error
|
||||
0: {0: "Environment temperature is too high."}, # normal env temp error
|
||||
0: {
|
||||
0: "Environment temperature is too high.",
|
||||
}, # normal env temp error
|
||||
1: { # high power env temp error
|
||||
0: "Environment temperature is too high for high performance mode."
|
||||
0: "Environment temperature is too high for high performance mode.",
|
||||
},
|
||||
},
|
||||
7: { # control board error
|
||||
0: {0: "MAC address invalid", 1: "Control board no support chip."},
|
||||
0: {
|
||||
0: "MAC address invalid",
|
||||
1: "Control board no support chip.",
|
||||
},
|
||||
1: {
|
||||
0: "Control board rebooted as an exception.",
|
||||
1: "Control board rebooted as exception and cpufreq reduced, please upgrade the firmware",
|
||||
2: "Control board rebooted as an exception.",
|
||||
3: "The network is unstable, change time.",
|
||||
4: "Unknown error.",
|
||||
},
|
||||
2: {
|
||||
"n": "Control board slot {n} frame error.",
|
||||
},
|
||||
},
|
||||
8: { # checksum error
|
||||
@@ -228,63 +308,152 @@ ERROR_CODES = {
|
||||
0: "CGMiner checksum error.",
|
||||
1: "System monitor checksum error.",
|
||||
2: "Remote daemon checksum error.",
|
||||
}
|
||||
},
|
||||
1: {0: "Air to liquid PCB serial # does not match."},
|
||||
},
|
||||
9: {0: {1: "Power rate error."}}, # power rate error
|
||||
9: {
|
||||
0: {0: "Unknown error.", 1: "Power rate error.", 2: "Unknown error."}
|
||||
}, # power rate error
|
||||
20: { # pool error
|
||||
1: {0: "All pools are disabled."}, # all disabled error
|
||||
2: {"n": "Pool {n} connection failed."}, # pool connection failed error
|
||||
3: {0: "High rejection rate on pool."}, # rejection rate error
|
||||
0: {
|
||||
0: "No pool information configured.",
|
||||
},
|
||||
1: {
|
||||
0: "All pools are disabled.",
|
||||
}, # all disabled error
|
||||
2: {
|
||||
"n": "Pool {n} connection failed.",
|
||||
}, # pool connection failed error
|
||||
3: {
|
||||
0: "High rejection rate on pool.",
|
||||
}, # rejection rate error
|
||||
4: { # asicboost not supported error
|
||||
0: "The pool does not support asicboost mode."
|
||||
0: "The pool does not support asicboost mode.",
|
||||
},
|
||||
},
|
||||
21: {1: {"n": "Slot {n} factory test step failed."}},
|
||||
21: {
|
||||
1: {
|
||||
"n": "Slot {n} factory test step failed.",
|
||||
}
|
||||
},
|
||||
23: { # hashrate error
|
||||
1: {0: "Hashrate is too low."},
|
||||
2: {0: "Hashrate is too low."},
|
||||
3: {0: "Hashrate loss is too high."},
|
||||
4: {0: "Hashrate loss is too high."},
|
||||
5: {0: "Hashrate loss."},
|
||||
1: {
|
||||
0: "Hashrate is too low.",
|
||||
},
|
||||
2: {
|
||||
0: "Hashrate is too low.",
|
||||
},
|
||||
3: {
|
||||
0: "Hashrate loss is too high.",
|
||||
},
|
||||
4: {
|
||||
0: "Hashrate loss is too high.",
|
||||
},
|
||||
5: {
|
||||
0: "Hashrate loss.",
|
||||
},
|
||||
},
|
||||
50: { # water velocity error/voltage error
|
||||
1: {"n": "Slot {n} chip voltage too low."},
|
||||
2: {"n": "Slot {n} chip voltage changed."},
|
||||
3: {"n": "Slot {n} chip temperature difference is too large."},
|
||||
4: {"n": "Slot {n} chip hottest temperature difference is too large."},
|
||||
7: {"n": "Slot {n} water velocity is abnormal."}, # abnormal water velocity
|
||||
8: {0: "Chip temp calibration failed, please restore factory settings."},
|
||||
9: {"n": "Slot {n} chip temp calibration check no balance."},
|
||||
1: {
|
||||
"n": "Slot {n} chip voltage too low.",
|
||||
},
|
||||
2: {
|
||||
"n": "Slot {n} chip voltage changed.",
|
||||
},
|
||||
3: {
|
||||
"n": "Slot {n} chip temperature difference is too large.",
|
||||
},
|
||||
4: {
|
||||
"n": "Slot {n} chip hottest temperature difference is too large.",
|
||||
},
|
||||
5: {"n": "Slot {n} stopped hashing, chips temperature protecting."},
|
||||
7: {
|
||||
"n": "Slot {n} water velocity is abnormal.",
|
||||
}, # abnormal water velocity
|
||||
8: {
|
||||
0: "Chip temp calibration failed, please restore factory settings.",
|
||||
},
|
||||
9: {
|
||||
"n": "Slot {n} chip temp calibration check no balance.",
|
||||
},
|
||||
},
|
||||
51: { # frequency error
|
||||
1: {"n": "Slot {n} frequency up timeout."}, # frequency up timeout
|
||||
7: {"n": "Slot {n} frequency up timeout."}, # frequency up timeout
|
||||
1: {
|
||||
"n": "Slot {n} frequency up timeout.",
|
||||
}, # frequency up timeout
|
||||
2: {"n": "Slot {n} too many CRC errors."},
|
||||
3: {"n": "Slot {n} unstable."},
|
||||
7: {
|
||||
"n": "Slot {n} frequency up timeout.",
|
||||
}, # frequency up timeout
|
||||
},
|
||||
52: {
|
||||
"n": {
|
||||
"c": "Slot {n} chip {c} error nonce.",
|
||||
},
|
||||
},
|
||||
53: {
|
||||
"n": {
|
||||
"c": "Slot {n} chip {c} too few nonce.",
|
||||
},
|
||||
},
|
||||
54: {
|
||||
"n": {
|
||||
"c": "Slot {n} chip {c} temp protected.",
|
||||
},
|
||||
},
|
||||
55: {
|
||||
"n": {
|
||||
"c": "Slot {n} chip {c} has been reset.",
|
||||
},
|
||||
},
|
||||
56: {
|
||||
"n": {
|
||||
"c": "Slot {n} chip {c} zero nonce.",
|
||||
},
|
||||
},
|
||||
52: {"n": {"c": "Slot {n} chip {c} error nonce."}},
|
||||
53: {"n": {"c": "Slot {n} chip {c} too few nonce."}},
|
||||
54: {"n": {"c": "Slot {n} chip {c} temp protected."}},
|
||||
55: {"n": {"c": "Slot {n} chip {c} has been reset."}},
|
||||
56: {"n": {"c": "Slot {n} chip {c} does not return to the nonce."}},
|
||||
80: {
|
||||
0: {0: "The tool version is too low, please update."},
|
||||
1: {0: "Low freq."},
|
||||
2: {0: "Low hashrate."},
|
||||
3: {5: "High env temp."},
|
||||
0: {
|
||||
0: "The tool version is too low, please update.",
|
||||
},
|
||||
1: {
|
||||
0: "Low freq.",
|
||||
},
|
||||
2: {
|
||||
0: "Low hashrate.",
|
||||
},
|
||||
3: {
|
||||
5: "High env temp.",
|
||||
},
|
||||
},
|
||||
81: {
|
||||
0: {0: "Chip data error."},
|
||||
0: {
|
||||
0: "Chip data error.",
|
||||
},
|
||||
},
|
||||
82: {
|
||||
0: {0: "Power version error."},
|
||||
1: {0: "Miner type error."},
|
||||
2: {0: "Version info error."},
|
||||
0: {
|
||||
0: "Power version error.",
|
||||
},
|
||||
1: {
|
||||
0: "Miner type error.",
|
||||
},
|
||||
2: {
|
||||
0: "Version info error.",
|
||||
},
|
||||
},
|
||||
83: {
|
||||
0: {0: "Empty level error."},
|
||||
0: {
|
||||
0: "Empty level error.",
|
||||
},
|
||||
},
|
||||
84: {
|
||||
0: {0: "Old firmware."},
|
||||
1: {0: "Software version error."},
|
||||
0: {
|
||||
0: "Old firmware.",
|
||||
},
|
||||
1: {
|
||||
0: "Software version error.",
|
||||
},
|
||||
},
|
||||
85: {
|
||||
"n": {
|
||||
@@ -296,8 +465,12 @@ ERROR_CODES = {
|
||||
},
|
||||
},
|
||||
86: {
|
||||
0: {0: "Missing product serial #."},
|
||||
1: {0: "Missing product type."},
|
||||
0: {
|
||||
0: "Missing product serial #.",
|
||||
},
|
||||
1: {
|
||||
0: "Missing product type.",
|
||||
},
|
||||
2: {
|
||||
0: "Missing miner serial #.",
|
||||
1: "Wrong miner serial # length.",
|
||||
@@ -314,12 +487,34 @@ ERROR_CODES = {
|
||||
3: "Wrong power model rate.",
|
||||
4: "Wrong power model format.",
|
||||
},
|
||||
5: {0: "Wrong hash board struct."},
|
||||
6: {0: "Wrong miner cooling type."},
|
||||
7: {0: "Missing PCB serial #."},
|
||||
5: {
|
||||
0: "Wrong hash board struct.",
|
||||
},
|
||||
6: {
|
||||
0: "Wrong miner cooling type.",
|
||||
},
|
||||
7: {
|
||||
0: "Missing PCB serial #.",
|
||||
},
|
||||
},
|
||||
87: {
|
||||
0: {
|
||||
0: "Miner power mismatch.",
|
||||
},
|
||||
},
|
||||
90: {
|
||||
0: {
|
||||
0: "Process error, exited with signal: 3.",
|
||||
},
|
||||
1: {
|
||||
0: "Process error, exited with signal: 3.",
|
||||
},
|
||||
},
|
||||
99: {
|
||||
9: {
|
||||
9: "Miner unknown error.",
|
||||
},
|
||||
},
|
||||
87: {0: {0: "Miner power mismatch."}},
|
||||
99: {9: {9: "Miner unknown error."}},
|
||||
1000: {
|
||||
0: {
|
||||
0: "Security library error, please upgrade firmware",
|
||||
@@ -328,7 +523,11 @@ ERROR_CODES = {
|
||||
3: "/antiv/dig/pf_partial.dig illegal.",
|
||||
},
|
||||
},
|
||||
1001: {0: {0: "Security BTMiner removed, please upgrade firmware."}},
|
||||
1001: {
|
||||
0: {
|
||||
0: "Security BTMiner removed, please upgrade firmware.",
|
||||
},
|
||||
},
|
||||
1100: {
|
||||
0: {
|
||||
0: "Security illegal file, please upgrade firmware.",
|
||||
|
||||
@@ -16,31 +16,29 @@
|
||||
|
||||
import logging
|
||||
|
||||
from pyasic.settings import PyasicSettings
|
||||
|
||||
|
||||
def init_logger():
|
||||
if PyasicSettings().logfile:
|
||||
logging.basicConfig(
|
||||
filename="logfile.txt",
|
||||
filemode="a",
|
||||
format="%(pathname)s:%(lineno)d in %(funcName)s\n[%(levelname)s][%(asctime)s](%(name)s) - %(message)s",
|
||||
datefmt="%x %X",
|
||||
)
|
||||
else:
|
||||
logging.basicConfig(
|
||||
format="%(pathname)s:%(lineno)d in %(funcName)s\n[%(levelname)s][%(asctime)s](%(name)s) - %(message)s",
|
||||
datefmt="%x %X",
|
||||
)
|
||||
# if PyasicSettings().logfile:
|
||||
# logging.basicConfig(
|
||||
# filename="logfile.txt",
|
||||
# filemode="a",
|
||||
# format="%(pathname)s:%(lineno)d in %(funcName)s\n[%(levelname)s][%(asctime)s](%(name)s) - %(message)s",
|
||||
# datefmt="%x %X",
|
||||
# )
|
||||
# else:
|
||||
logging.basicConfig(
|
||||
format="%(pathname)s:%(lineno)d in %(funcName)s\n[%(levelname)s][%(asctime)s](%(name)s) - %(message)s",
|
||||
datefmt="%x %X",
|
||||
)
|
||||
|
||||
_logger = logging.getLogger()
|
||||
|
||||
if PyasicSettings().debug:
|
||||
_logger.setLevel(logging.DEBUG)
|
||||
logging.getLogger("asyncssh").setLevel(logging.DEBUG)
|
||||
else:
|
||||
_logger.setLevel(logging.WARNING)
|
||||
logging.getLogger("asyncssh").setLevel(logging.WARNING)
|
||||
# if PyasicSettings().debug:
|
||||
# _logger.setLevel(logging.DEBUG)
|
||||
# logging.getLogger("asyncssh").setLevel(logging.DEBUG)
|
||||
# else:
|
||||
_logger.setLevel(logging.WARNING)
|
||||
logging.getLogger("asyncssh").setLevel(logging.WARNING)
|
||||
|
||||
return _logger
|
||||
|
||||
|
||||
@@ -21,9 +21,11 @@ from pyasic.miners.types import (
|
||||
S19XP,
|
||||
S19a,
|
||||
S19aPro,
|
||||
S19i,
|
||||
S19j,
|
||||
S19jNoPIC,
|
||||
S19jPro,
|
||||
S19Plus,
|
||||
S19Pro,
|
||||
S19ProPlus,
|
||||
)
|
||||
@@ -33,6 +35,14 @@ class BMMinerS19(AntminerModern, S19):
|
||||
pass
|
||||
|
||||
|
||||
class BMMinerS19Plus(AntminerModern, S19Plus):
|
||||
pass
|
||||
|
||||
|
||||
class BMMinerS19i(AntminerModern, S19i):
|
||||
pass
|
||||
|
||||
|
||||
class BMMinerS19Pro(AntminerModern, S19Pro):
|
||||
pass
|
||||
|
||||
|
||||
@@ -18,10 +18,12 @@ from .S19 import (
|
||||
BMMinerS19,
|
||||
BMMinerS19a,
|
||||
BMMinerS19aPro,
|
||||
BMMinerS19i,
|
||||
BMMinerS19j,
|
||||
BMMinerS19jNoPIC,
|
||||
BMMinerS19jPro,
|
||||
BMMinerS19L,
|
||||
BMMinerS19Plus,
|
||||
BMMinerS19Pro,
|
||||
BMMinerS19ProPlus,
|
||||
BMMinerS19XP,
|
||||
|
||||
@@ -109,7 +109,7 @@ class AntminerModern(BMMiner):
|
||||
data = await self.web.blink(blink=False)
|
||||
if data:
|
||||
if data.get("code") == "B100":
|
||||
self.light = True
|
||||
self.light = False
|
||||
return self.light
|
||||
|
||||
async def reboot(self) -> bool:
|
||||
@@ -274,7 +274,11 @@ class AntminerModern(BMMiner):
|
||||
|
||||
if web_get_conf:
|
||||
try:
|
||||
return False if int(web_get_conf["bitmain-work-mode"]) == 1 else True
|
||||
if web_get_conf["bitmain-work-mode"].isdigit():
|
||||
return (
|
||||
False if int(web_get_conf["bitmain-work-mode"]) == 1 else True
|
||||
)
|
||||
return False
|
||||
except LookupError:
|
||||
pass
|
||||
|
||||
|
||||
@@ -235,7 +235,22 @@ class BMMiner(BaseMiner):
|
||||
if board_offset == -1:
|
||||
board_offset = 1
|
||||
|
||||
for i in range(board_offset, board_offset + self.ideal_hashboards):
|
||||
real_slots = []
|
||||
|
||||
for i in range(board_offset, board_offset + 4):
|
||||
try:
|
||||
key = f"chain_acs{i}"
|
||||
if boards[1].get(key, "") != "":
|
||||
real_slots.append(i)
|
||||
except LookupError:
|
||||
pass
|
||||
|
||||
if len(real_slots) < 3:
|
||||
real_slots = list(
|
||||
range(board_offset, board_offset + self.ideal_hashboards)
|
||||
)
|
||||
|
||||
for i in real_slots:
|
||||
hashboard = HashBoard(
|
||||
slot=i - board_offset, expected_chips=self.nominal_chips
|
||||
)
|
||||
@@ -259,7 +274,7 @@ class BMMiner(BaseMiner):
|
||||
if (not chips) or (not chips > 0):
|
||||
hashboard.missing = True
|
||||
hashboards.append(hashboard)
|
||||
except (IndexError, KeyError, ValueError, TypeError):
|
||||
except (LookupError, ValueError, TypeError):
|
||||
pass
|
||||
|
||||
return hashboards
|
||||
|
||||
@@ -184,11 +184,11 @@ BOSMINER_DATA_LOC = {
|
||||
|
||||
|
||||
class BOSMiner(BaseMiner):
|
||||
def __init__(self, ip: str, api_ver: str = "0.0.0") -> None:
|
||||
def __init__(self, ip: str, api_ver: str = "0.0.0", boser: bool = None) -> None:
|
||||
super().__init__(ip)
|
||||
# interfaces
|
||||
self.api = BOSMinerAPI(ip, api_ver)
|
||||
self.web = BOSMinerWebAPI(ip)
|
||||
self.web = BOSMinerWebAPI(ip, boser=boser)
|
||||
|
||||
# static data
|
||||
self.api_type = "BOSMiner"
|
||||
@@ -303,17 +303,12 @@ class BOSMiner(BaseMiner):
|
||||
The config from `self.config`.
|
||||
"""
|
||||
logging.debug(f"{self}: Getting config.")
|
||||
conn = None
|
||||
|
||||
try:
|
||||
conn = await self._get_ssh_connection()
|
||||
except ConnectionError:
|
||||
try:
|
||||
pools = await self.api.pools()
|
||||
except APIError:
|
||||
return self.config
|
||||
if pools:
|
||||
self.config = MinerConfig().from_api(pools["POOLS"])
|
||||
return self.config
|
||||
conn = None
|
||||
|
||||
if conn:
|
||||
async with conn:
|
||||
# good ol' BBB compatibility :/
|
||||
@@ -365,6 +360,8 @@ class BOSMiner(BaseMiner):
|
||||
async def set_power_limit(self, wattage: int) -> bool:
|
||||
try:
|
||||
cfg = await self.get_config()
|
||||
if cfg is None:
|
||||
return False
|
||||
cfg.autotuning_wattage = wattage
|
||||
await self.send_config(cfg)
|
||||
except Exception as e:
|
||||
@@ -790,7 +787,7 @@ class BOSMiner(BaseMiner):
|
||||
)
|
||||
except APIError:
|
||||
pass
|
||||
if graphql_fans:
|
||||
if graphql_fans.get("data"):
|
||||
fans = []
|
||||
for n in range(self.fan_count):
|
||||
try:
|
||||
@@ -1046,7 +1043,7 @@ class BOSMiner(BaseMiner):
|
||||
if data == "50":
|
||||
self.light = True
|
||||
return self.light
|
||||
except TypeError:
|
||||
except (TypeError, AttributeError):
|
||||
return self.light
|
||||
|
||||
async def get_nominal_hashrate(self, api_devs: dict = None) -> Optional[float]:
|
||||
|
||||
@@ -442,7 +442,8 @@ class BTMiner(BaseMiner):
|
||||
|
||||
if api_summary:
|
||||
try:
|
||||
return api_summary["SUMMARY"][0]["Power"]
|
||||
wattage = api_summary["SUMMARY"][0]["Power"]
|
||||
return wattage if not wattage == -1 else None
|
||||
except (KeyError, IndexError):
|
||||
pass
|
||||
|
||||
|
||||
@@ -179,11 +179,12 @@ class CGMinerAvalon(CGMiner):
|
||||
pass
|
||||
|
||||
async def get_hostname(self, mac: str = None) -> Optional[str]:
|
||||
if not mac:
|
||||
mac = await self.get_mac()
|
||||
|
||||
if mac:
|
||||
return f"Avalon{mac.replace(':', '')[-6:]}"
|
||||
return None
|
||||
# if not mac:
|
||||
# mac = await self.get_mac()
|
||||
#
|
||||
# if mac:
|
||||
# return f"Avalon{mac.replace(':', '')[-6:]}"
|
||||
|
||||
async def get_hashrate(self, api_devs: dict = None) -> Optional[float]:
|
||||
if not api_devs:
|
||||
|
||||
@@ -413,21 +413,29 @@ class BaseMiner(ABC):
|
||||
"""
|
||||
pass
|
||||
|
||||
async def _get_data(self, allow_warning: bool, data_to_get: list = None) -> dict:
|
||||
if not data_to_get:
|
||||
async def _get_data(
|
||||
self, allow_warning: bool, include: list = None, exclude: list = None
|
||||
) -> dict:
|
||||
if include is None:
|
||||
# everything
|
||||
data_to_get = list(self.data_locations.keys())
|
||||
include = list(self.data_locations.keys())
|
||||
|
||||
if exclude is not None:
|
||||
for item in exclude:
|
||||
if item in include:
|
||||
include.remove(item)
|
||||
|
||||
api_multicommand = set()
|
||||
web_multicommand = set()
|
||||
for data_name in data_to_get:
|
||||
web_multicommand = []
|
||||
for data_name in include:
|
||||
try:
|
||||
fn_args = self.data_locations[data_name]["kwargs"]
|
||||
for arg_name in fn_args:
|
||||
if fn_args[arg_name].get("api"):
|
||||
api_multicommand.add(fn_args[arg_name]["api"])
|
||||
if fn_args[arg_name].get("web"):
|
||||
web_multicommand.add(fn_args[arg_name]["web"])
|
||||
if not fn_args[arg_name]["web"] in web_multicommand:
|
||||
web_multicommand.append(fn_args[arg_name]["web"])
|
||||
except KeyError as e:
|
||||
logger.error(e, data_name)
|
||||
continue
|
||||
@@ -445,8 +453,6 @@ class BaseMiner(ABC):
|
||||
else:
|
||||
web_command_task = asyncio.sleep(0)
|
||||
|
||||
from datetime import datetime
|
||||
|
||||
web_command_data = await web_command_task
|
||||
if web_command_data is None:
|
||||
web_command_data = {}
|
||||
@@ -457,7 +463,7 @@ class BaseMiner(ABC):
|
||||
|
||||
miner_data = {}
|
||||
|
||||
for data_name in data_to_get:
|
||||
for data_name in include:
|
||||
try:
|
||||
fn_args = self.data_locations[data_name]["kwargs"]
|
||||
args_to_send = {k: None for k in fn_args}
|
||||
@@ -511,13 +517,14 @@ class BaseMiner(ABC):
|
||||
return miner_data
|
||||
|
||||
async def get_data(
|
||||
self, allow_warning: bool = False, data_to_get: list = None
|
||||
self, allow_warning: bool = False, include: list = None, exclude: list = None
|
||||
) -> MinerData:
|
||||
"""Get data from the miner in the form of [`MinerData`][pyasic.data.MinerData].
|
||||
|
||||
Parameters:
|
||||
allow_warning: Allow warning when an API command fails.
|
||||
data_to_get: Names of data items you want to gather. Defaults to all data.
|
||||
include: Names of data items you want to gather. Defaults to all data.
|
||||
exclude: Names of data items to exclude. Exclusion happens after considering included items.
|
||||
|
||||
Returns:
|
||||
A [`MinerData`][pyasic.data.MinerData] instance containing data from the miner.
|
||||
@@ -533,7 +540,9 @@ class BaseMiner(ABC):
|
||||
],
|
||||
)
|
||||
|
||||
gathered_data = await self._get_data(allow_warning, data_to_get=data_to_get)
|
||||
gathered_data = await self._get_data(
|
||||
allow_warning=allow_warning, include=include, exclude=exclude
|
||||
)
|
||||
for item in gathered_data:
|
||||
if gathered_data[item] is not None:
|
||||
setattr(data, item, gathered_data[item])
|
||||
|
||||
@@ -20,8 +20,9 @@ import enum
|
||||
import ipaddress
|
||||
import json
|
||||
import re
|
||||
from typing import Callable, List, Optional, Tuple, Union
|
||||
from typing import AsyncGenerator, Callable, List, Optional, Tuple, Union
|
||||
|
||||
import anyio
|
||||
import httpx
|
||||
|
||||
from pyasic.logger import logger
|
||||
@@ -85,6 +86,8 @@ MINER_CLASSES = {
|
||||
"ANTMINER S19L": BMMinerS19L,
|
||||
"ANTMINER S19 PRO": BMMinerS19Pro,
|
||||
"ANTMINER S19J": BMMinerS19j,
|
||||
"ANTMINER S19I": BMMinerS19i,
|
||||
"ANTMINER S19+": BMMinerS19Plus,
|
||||
"ANTMINER S19J88NOPIC": BMMinerS19jNoPIC,
|
||||
"ANTMINER S19PRO+": BMMinerS19ProPlus,
|
||||
"ANTMINER S19J PRO": BMMinerS19jPro,
|
||||
@@ -99,6 +102,8 @@ MINER_CLASSES = {
|
||||
"M20SV10": BTMinerM20SV10,
|
||||
"M20SV20": BTMinerM20SV20,
|
||||
"M20SV30": BTMinerM20SV30,
|
||||
"M20PV10": BTMinerM20PV10,
|
||||
"M20PV30": BTMinerM20PV30,
|
||||
"M20S+V30": BTMinerM20SPlusV30,
|
||||
"M21V10": BTMinerM21V10,
|
||||
"M21SV20": BTMinerM21SV20,
|
||||
@@ -375,7 +380,9 @@ class MinerFactory:
|
||||
def clear_cached_miners(self):
|
||||
self.cache = {}
|
||||
|
||||
async def get_multiple_miners(self, ips: List[str], limit: int = 200):
|
||||
async def get_multiple_miners(
|
||||
self, ips: List[str], limit: int = 200
|
||||
) -> List[AnyMiner]:
|
||||
results = []
|
||||
|
||||
async for miner in self.get_miner_generator(ips, limit):
|
||||
@@ -383,7 +390,7 @@ class MinerFactory:
|
||||
|
||||
return results
|
||||
|
||||
async def get_miner_generator(self, ips: list, limit: int = 200):
|
||||
async def get_miner_generator(self, ips: list, limit: int = 200) -> AsyncGenerator:
|
||||
tasks = []
|
||||
semaphore = asyncio.Semaphore(limit)
|
||||
|
||||
@@ -391,13 +398,10 @@ class MinerFactory:
|
||||
tasks.append(asyncio.create_task(self.get_miner(ip)))
|
||||
|
||||
for task in tasks:
|
||||
await semaphore.acquire()
|
||||
try:
|
||||
async with semaphore:
|
||||
result = await task
|
||||
if result is not None:
|
||||
yield result
|
||||
finally:
|
||||
semaphore.release()
|
||||
|
||||
async def get_miner(self, ip: str):
|
||||
ip = str(ip)
|
||||
@@ -434,12 +438,19 @@ class MinerFactory:
|
||||
if fn is not None:
|
||||
task = asyncio.create_task(fn(ip))
|
||||
try:
|
||||
miner_model = await asyncio.wait_for(task, timeout=30)
|
||||
miner_model = await asyncio.wait_for(task, timeout=TIMEOUT)
|
||||
except asyncio.TimeoutError:
|
||||
task.cancel()
|
||||
|
||||
boser_enabled = None
|
||||
if miner_type == MinerTypes.BRAIINS_OS:
|
||||
boser_enabled = await self.get_boser_braiins_os(ip)
|
||||
|
||||
miner = self._select_miner_from_classes(
|
||||
ip, miner_type=miner_type, miner_model=miner_model
|
||||
ip,
|
||||
miner_type=miner_type,
|
||||
miner_model=miner_model,
|
||||
boser_enabled=boser_enabled,
|
||||
)
|
||||
|
||||
if miner is not None and not isinstance(miner, UnknownMiner):
|
||||
@@ -472,7 +483,12 @@ class MinerFactory:
|
||||
try:
|
||||
resp = await session.get(url, follow_redirects=False)
|
||||
return resp.text, resp
|
||||
except (httpx.HTTPError, asyncio.TimeoutError):
|
||||
except (
|
||||
httpx.HTTPError,
|
||||
asyncio.TimeoutError,
|
||||
anyio.EndOfStream,
|
||||
anyio.ClosedResourceError,
|
||||
):
|
||||
pass
|
||||
return None, None
|
||||
|
||||
@@ -683,9 +699,13 @@ class MinerFactory:
|
||||
ip: ipaddress.ip_address,
|
||||
miner_model: Union[str, None],
|
||||
miner_type: Union[MinerTypes, None],
|
||||
boser_enabled: bool = None,
|
||||
) -> AnyMiner:
|
||||
kwargs = {}
|
||||
if boser_enabled is not None:
|
||||
kwargs["boser"] = boser_enabled
|
||||
try:
|
||||
return MINER_CLASSES[miner_type][str(miner_model).upper()](ip)
|
||||
return MINER_CLASSES[miner_type][str(miner_model).upper()](ip, **kwargs)
|
||||
except LookupError:
|
||||
if miner_type in MINER_CLASSES:
|
||||
return MINER_CLASSES[miner_type][None](ip)
|
||||
@@ -751,7 +771,8 @@ class MinerFactory:
|
||||
async def get_miner_model_whatsminer(self, ip: str):
|
||||
sock_json_data = await self.send_api_command(ip, "devdetails")
|
||||
try:
|
||||
miner_model = sock_json_data["DEVDETAILS"][0]["Model"]
|
||||
miner_model = sock_json_data["DEVDETAILS"][0]["Model"].replace("_", "")
|
||||
miner_model = miner_model[:-1] + "0"
|
||||
|
||||
return miner_model
|
||||
except (TypeError, LookupError):
|
||||
@@ -775,9 +796,9 @@ class MinerFactory:
|
||||
f"http://{ip}/api/auth",
|
||||
data={"username": "admin", "password": "admin"},
|
||||
)
|
||||
auth = (await auth_req.json())["jwt"]
|
||||
auth = auth_req.json()["jwt"]
|
||||
|
||||
web_data = await (
|
||||
web_data = (
|
||||
await session.post(
|
||||
f"http://{ip}/api/type",
|
||||
headers={"Authorization": "Bearer " + auth},
|
||||
@@ -806,12 +827,21 @@ class MinerFactory:
|
||||
json={"query": "{bosminer {info{modelName}}}"},
|
||||
)
|
||||
if d.status_code == 200:
|
||||
json_data = await d.json()
|
||||
json_data = d.json()
|
||||
miner_model = json_data["data"]["bosminer"]["info"]["modelName"]
|
||||
return miner_model
|
||||
except (httpx.HTTPError, LookupError):
|
||||
pass
|
||||
|
||||
async def get_boser_braiins_os(self, ip: str):
|
||||
# TODO: refine this check
|
||||
try:
|
||||
sock_json_data = await self.send_api_command(ip, "version")
|
||||
return sock_json_data["STATUS"][0]["Msg"].split(" ")[0].upper() == "BOSER"
|
||||
except LookupError:
|
||||
# let the bosminer class decide
|
||||
return None
|
||||
|
||||
async def get_miner_model_vnish(self, ip: str) -> Optional[str]:
|
||||
sock_json_data = await self.send_api_command(ip, "stats")
|
||||
try:
|
||||
|
||||
@@ -44,6 +44,24 @@ class S19Pro(AntMiner): # noqa - ignore ABC method implementation
|
||||
self.fan_count = 4
|
||||
|
||||
|
||||
class S19i(AntMiner): # noqa - ignore ABC method implementation
|
||||
def __init__(self, ip: str, api_ver: str = "0.0.0"):
|
||||
super().__init__(ip, api_ver)
|
||||
self.ip = ip
|
||||
self.model = "S19i"
|
||||
self.nominal_chips = 80
|
||||
self.fan_count = 4
|
||||
|
||||
|
||||
class S19Plus(AntMiner): # noqa - ignore ABC method implementation
|
||||
def __init__(self, ip: str, api_ver: str = "0.0.0"):
|
||||
super().__init__(ip, api_ver)
|
||||
self.ip = ip
|
||||
self.model = "S19+"
|
||||
self.nominal_chips = 80
|
||||
self.fan_count = 4
|
||||
|
||||
|
||||
class S19ProPlus(AntMiner): # noqa - ignore ABC method implementation
|
||||
def __init__(self, ip: str, api_ver: str = "0.0.0"):
|
||||
super().__init__(ip, api_ver)
|
||||
|
||||
@@ -20,10 +20,12 @@ from .S19 import (
|
||||
S19XP,
|
||||
S19a,
|
||||
S19aPro,
|
||||
S19i,
|
||||
S19j,
|
||||
S19jNoPIC,
|
||||
S19jPro,
|
||||
S19NoPIC,
|
||||
S19Plus,
|
||||
S19Pro,
|
||||
S19ProPlus,
|
||||
)
|
||||
|
||||
pyasic/miners/types/whatsminer/M2X/M20P.py (new file, 35 lines)
@@ -0,0 +1,35 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
# Copyright 2022 Upstream Data Inc -
|
||||
# -
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); -
|
||||
# you may not use this file except in compliance with the License. -
|
||||
# You may obtain a copy of the License at -
|
||||
# -
|
||||
# http://www.apache.org/licenses/LICENSE-2.0 -
|
||||
# -
|
||||
# Unless required by applicable law or agreed to in writing, software -
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, -
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
|
||||
# See the License for the specific language governing permissions and -
|
||||
# limitations under the License. -
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
from pyasic.miners.makes import WhatsMiner
|
||||
|
||||
|
||||
class M20PV10(WhatsMiner): # noqa - ignore ABC method implementation
|
||||
def __init__(self, ip: str, api_ver: str = "0.0.0"):
|
||||
super().__init__(ip, api_ver)
|
||||
self.ip = ip
|
||||
self.model = "M20P V10"
|
||||
self.nominal_chips = 156
|
||||
self.fan_count = 2
|
||||
|
||||
|
||||
class M20PV30(WhatsMiner): # noqa - ignore ABC method implementation
|
||||
def __init__(self, ip: str, api_ver: str = "0.0.0"):
|
||||
super().__init__(ip, api_ver)
|
||||
self.ip = ip
|
||||
self.model = "M20P V30"
|
||||
self.nominal_chips = 148
|
||||
self.fan_count = 2
|
||||
@@ -42,8 +42,5 @@ class M20SV30(WhatsMiner): # noqa - ignore ABC method implementation
|
||||
super().__init__(ip, api_ver)
|
||||
self.ip = ip
|
||||
self.model = "M20S V30"
|
||||
self.nominal_chips = 0
|
||||
warnings.warn(
|
||||
"Unknown chip count for miner type M20SV30, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
|
||||
)
|
||||
self.nominal_chips = 140
|
||||
self.fan_count = 2
|
||||
|
||||
@@ -24,8 +24,5 @@ class M29V10(WhatsMiner): # noqa - ignore ABC method implementation
|
||||
super().__init__(ip, api_ver)
|
||||
self.ip = ip
|
||||
self.model = "M29 V10"
|
||||
self.nominal_chips = 0
|
||||
warnings.warn(
|
||||
"Unknown chip count for miner type M29V10, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
|
||||
)
|
||||
self.nominal_chips = 50
|
||||
self.fan_count = 2
|
||||
|
||||
@@ -15,6 +15,7 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
from .M20 import M20V10
|
||||
from .M20P import M20PV10, M20PV30
|
||||
from .M20S import M20SV10, M20SV20, M20SV30
|
||||
from .M20S_Plus import M20SPlusV30
|
||||
from .M21 import M21V10
|
||||
|
||||
@@ -165,10 +165,7 @@ class M30SPlusVE50(WhatsMiner): # noqa - ignore ABC method implementation
|
||||
super().__init__(ip, api_ver)
|
||||
self.ip = ip
|
||||
self.model = "M30S+ VE50"
|
||||
self.nominal_chips = 0
|
||||
warnings.warn(
|
||||
"Unknown chip count for miner type M30S+ VE50, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
|
||||
)
|
||||
self.nominal_chips = 164
|
||||
self.fan_count = 2
|
||||
|
||||
|
||||
@@ -291,10 +288,7 @@ class M30SPlusVG50(WhatsMiner): # noqa - ignore ABC method implementation
|
||||
super().__init__(ip, api_ver)
|
||||
self.ip = ip
|
||||
self.model = "M30S+ VG50"
|
||||
self.nominal_chips = 0
|
||||
warnings.warn(
|
||||
"Unknown chip count for miner type M30S+ VG50, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
|
||||
)
|
||||
self.nominal_chips = 111
|
||||
self.fan_count = 2
|
||||
|
||||
|
||||
|
||||
pyasic/miners/whatsminer/btminer/M2X/M20P.py (new file, 26 lines)
@@ -0,0 +1,26 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
# Copyright 2022 Upstream Data Inc -
|
||||
# -
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); -
|
||||
# you may not use this file except in compliance with the License. -
|
||||
# You may obtain a copy of the License at -
|
||||
# -
|
||||
# http://www.apache.org/licenses/LICENSE-2.0 -
|
||||
# -
|
||||
# Unless required by applicable law or agreed to in writing, software -
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, -
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
|
||||
# See the License for the specific language governing permissions and -
|
||||
# limitations under the License. -
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
from pyasic.miners.backends import M2X
|
||||
from pyasic.miners.types import M20PV10, M20PV30
|
||||
|
||||
|
||||
class BTMinerM20PV10(M2X, M20PV10):
|
||||
pass
|
||||
|
||||
|
||||
class BTMinerM20PV30(M2X, M20PV30):
|
||||
pass
|
||||
@@ -15,6 +15,7 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
from .M20 import BTMinerM20V10
|
||||
from .M20P import BTMinerM20PV10, BTMinerM20PV30
|
||||
from .M20S import BTMinerM20SV10, BTMinerM20SV20, BTMinerM20SV30
|
||||
from .M20S_Plus import BTMinerM20SPlusV30
|
||||
from .M21 import BTMinerM21V10
|
||||
|
||||
@@ -19,9 +19,9 @@ import ipaddress
|
||||
import logging
|
||||
from typing import AsyncIterator, List, Union
|
||||
|
||||
from pyasic import settings
|
||||
from pyasic.miners.miner_factory import AnyMiner, miner_factory
|
||||
from pyasic.network.net_range import MinerNetworkRange
|
||||
from pyasic.settings import PyasicSettings
|
||||
|
||||
|
||||
class MinerNetwork:
|
||||
@@ -108,7 +108,7 @@ class MinerNetwork:
|
||||
# clear cached miners
|
||||
miner_factory.clear_cached_miners()
|
||||
|
||||
limit = asyncio.Semaphore(PyasicSettings().network_scan_threads)
|
||||
limit = asyncio.Semaphore(settings.get("network_scan_threads", 300))
|
||||
miners = await asyncio.gather(
|
||||
*[self.ping_and_get_miner(host, limit) for host in local_network.hosts()]
|
||||
)
|
||||
@@ -136,7 +136,7 @@ class MinerNetwork:
|
||||
local_network = self.get_network()
|
||||
|
||||
# create a list of scan tasks
|
||||
limit = asyncio.Semaphore(PyasicSettings().network_scan_threads)
|
||||
limit = asyncio.Semaphore(settings.get("network_scan_threads", 300))
|
||||
miners = asyncio.as_completed(
|
||||
[
|
||||
loop.create_task(self.ping_and_get_miner(host, limit))
|
||||
@@ -191,12 +191,12 @@ class MinerNetwork:
|
||||
async def ping_miner(
|
||||
ip: ipaddress.ip_address, port=4028
|
||||
) -> Union[None, ipaddress.ip_address]:
|
||||
for i in range(PyasicSettings().network_ping_retries):
|
||||
for i in range(settings.get("network_ping_retries", 1)):
|
||||
try:
|
||||
connection_fut = asyncio.open_connection(str(ip), port)
|
||||
# get the read and write streams from the connection
|
||||
reader, writer = await asyncio.wait_for(
|
||||
connection_fut, timeout=PyasicSettings().network_ping_timeout
|
||||
connection_fut, timeout=settings.get("network_ping_timeout", 3)
|
||||
)
|
||||
# immediately close connection, we know connection happened
|
||||
writer.close()
|
||||
@@ -220,12 +220,12 @@ async def ping_miner(
|
||||
async def ping_and_get_miner(
|
||||
ip: ipaddress.ip_address, port=4028
|
||||
) -> Union[None, AnyMiner]:
|
||||
for i in range(PyasicSettings().network_ping_retries):
|
||||
for i in range(settings.get("network_ping_retries", 1)):
|
||||
try:
|
||||
connection_fut = asyncio.open_connection(str(ip), port)
|
||||
# get the read and write streams from the connection
|
||||
reader, writer = await asyncio.wait_for(
|
||||
connection_fut, timeout=PyasicSettings().network_ping_timeout
|
||||
connection_fut, timeout=settings.get("network_ping_timeout", 3)
|
||||
)
|
||||
# immediately close connection, we know connection happened
|
||||
writer.close()
|
||||
|
||||
@@ -14,27 +14,26 @@
# limitations under the License. -
# ------------------------------------------------------------------------------

-from dataclasses import dataclass
+from typing import Any

-from pyasic.misc import Singleton
+_settings = {  # defaults
+    "network_ping_retries": 1,
+    "network_ping_timeout": 3,
+    "network_scan_threads": 300,
+    "factory_get_retries": 1,
+    "get_data_retries": 1,
+    "default_whatsminer_password": "admin",
+    "default_innosilicon_password": "admin",
+    "default_antminer_password": "root",
+    "default_bosminer_password": "root",
+    "default_vnish_password": "admin",
+    "default_goldshell_password": "123456789",
+}


-@dataclass
-class PyasicSettings(metaclass=Singleton):
-    network_ping_retries: int = 1
-    network_ping_timeout: int = 3
-    network_scan_threads: int = 300
+def get(key: str, other: Any = None) -> Any:
+    return _settings.get(key, other)

-    miner_factory_get_version_retries: int = 1

-    miner_get_data_retries: int = 1

-    global_whatsminer_password = "admin"
-    global_innosilicon_password = "admin"
-    global_antminer_password = "root"
-    global_bosminer_password = "root"
-    global_vnish_password = "admin"
-    global_goldshell_password = "123456789"

-    debug: bool = False
-    logfile: bool = False
+def update(key: str, val: Any) -> Any:
+    _settings[key] = val

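The hunk above replaces the `PyasicSettings` singleton dataclass with a flat module-level dict plus `get()`/`update()` accessors. A short usage sketch of the new API (key names come from the defaults dict above; the `from pyasic import settings` import path is the one used by the web API hunks below):

```python
from pyasic import settings

# read a value, falling back to a default when the key is absent
timeout = settings.get("network_ping_timeout", 3)

# override defaults before scanning or talking to miners
settings.update("network_ping_retries", 3)
settings.update("default_antminer_password", "my_antminer_pwd")  # placeholder value
```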
@@ -24,7 +24,7 @@ from pyasic.errors import APIWarning
class BaseWebAPI(ABC):
    def __init__(self, ip: str) -> None:
        # ip address of the miner
        self.ip = ip  # ipaddress.ip_address(ip)
        self.ip = ip  # ipaddress.ip_address(ip)
        self.username = "root"
        self.pwd = "root"

@@ -19,14 +19,14 @@ from typing import Union

import httpx

-from pyasic.settings import PyasicSettings
+from pyasic import settings
from pyasic.web import BaseWebAPI


class AntminerModernWebAPI(BaseWebAPI):
    def __init__(self, ip: str) -> None:
        super().__init__(ip)
-        self.pwd = PyasicSettings().global_antminer_password
+        self.pwd = settings.get("default_antminer_password", "root")

    async def send_command(
        self,
@@ -137,7 +137,7 @@ class AntminerModernWebAPI(BaseWebAPI):
class AntminerOldWebAPI(BaseWebAPI):
    def __init__(self, ip: str) -> None:
        super().__init__(ip)
-        self.pwd = PyasicSettings().global_antminer_password
+        self.pwd = settings.get("default_antminer_password", "root")

    async def send_command(
        self,

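Note that both Antminer web classes now read the password from `settings` inside `__init__`, so any `settings.update()` override has to happen before the object is constructed. A sketch of that ordering (the `pyasic.web.antminer` import path and the IP are assumptions for illustration, not shown in this hunk):

```python
from pyasic import settings
from pyasic.web.antminer import AntminerModernWebAPI  # import path assumed

settings.update("default_antminer_password", "not_root")
web = AntminerModernWebAPI("10.0.0.5")  # placeholder IP
assert web.pwd == "not_root"
```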
@@ -1,195 +0,0 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
# Copyright 2022 Upstream Data Inc -
|
||||
# -
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); -
|
||||
# you may not use this file except in compliance with the License. -
|
||||
# You may obtain a copy of the License at -
|
||||
# -
|
||||
# http://www.apache.org/licenses/LICENSE-2.0 -
|
||||
# -
|
||||
# Unless required by applicable law or agreed to in writing, software -
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, -
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
|
||||
# See the License for the specific language governing permissions and -
|
||||
# limitations under the License. -
|
||||
# ------------------------------------------------------------------------------
|
||||
import json
|
||||
from typing import Union
|
||||
|
||||
import httpx
|
||||
|
||||
from pyasic import APIError
|
||||
from pyasic.settings import PyasicSettings
|
||||
from pyasic.web import BaseWebAPI
|
||||
|
||||
|
||||
class BOSMinerWebAPI(BaseWebAPI):
|
||||
def __init__(self, ip: str) -> None:
|
||||
super().__init__(ip)
|
||||
self.pwd = PyasicSettings().global_bosminer_password
|
||||
|
||||
async def send_command(
|
||||
self,
|
||||
command: Union[str, dict],
|
||||
ignore_errors: bool = False,
|
||||
allow_warning: bool = True,
|
||||
**parameters: Union[str, int, bool],
|
||||
) -> dict:
|
||||
if isinstance(command, str):
|
||||
return await self.send_luci_command(command)
|
||||
else:
|
||||
return await self.send_gql_command(command)
|
||||
|
||||
def parse_command(self, graphql_command: Union[dict, set]) -> str:
|
||||
if isinstance(graphql_command, dict):
|
||||
data = []
|
||||
for key in graphql_command:
|
||||
if graphql_command[key] is not None:
|
||||
parsed = self.parse_command(graphql_command[key])
|
||||
data.append(key + parsed)
|
||||
else:
|
||||
data.append(key)
|
||||
else:
|
||||
data = graphql_command
|
||||
return "{" + ",".join(data) + "}"
|
||||
|
||||
async def send_gql_command(
|
||||
self,
|
||||
command: dict,
|
||||
) -> dict:
|
||||
url = f"http://{self.ip}/graphql"
|
||||
query = command
|
||||
if command.get("query") is None:
|
||||
query = {"query": self.parse_command(command)}
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
await self.auth(client)
|
||||
data = await client.post(url, json=query)
|
||||
except httpx.HTTPError:
|
||||
pass
|
||||
else:
|
||||
if data.status_code == 200:
|
||||
try:
|
||||
return data.json()
|
||||
except json.decoder.JSONDecodeError:
|
||||
pass
|
||||
|
||||
async def multicommand(
|
||||
self, *commands: Union[dict, str], allow_warning: bool = True
|
||||
) -> dict:
|
||||
luci_commands = []
|
||||
gql_commands = []
|
||||
for cmd in commands:
|
||||
if isinstance(cmd, dict):
|
||||
gql_commands.append(cmd)
|
||||
if isinstance(cmd, str):
|
||||
luci_commands.append(cmd)
|
||||
|
||||
luci_data = await self.luci_multicommand(*luci_commands)
|
||||
gql_data = await self.gql_multicommand(*gql_commands)
|
||||
|
||||
if gql_data is None:
|
||||
gql_data = {}
|
||||
if luci_data is None:
|
||||
luci_data = {}
|
||||
|
||||
data = dict(**luci_data, **gql_data)
|
||||
return data
|
||||
|
||||
async def luci_multicommand(self, *commands: str) -> dict:
|
||||
data = {}
|
||||
for command in commands:
|
||||
data[command] = await self.send_luci_command(command, ignore_errors=True)
|
||||
return data
|
||||
|
||||
async def gql_multicommand(self, *commands: dict) -> dict:
|
||||
def merge(*d: dict):
|
||||
ret = {}
|
||||
for i in d:
|
||||
if i:
|
||||
for k in i:
|
||||
if not k in ret:
|
||||
ret[k] = i[k]
|
||||
else:
|
||||
ret[k] = merge(ret[k], i[k])
|
||||
return None if ret == {} else ret
|
||||
|
||||
command = merge(*commands)
|
||||
data = await self.send_command(command)
|
||||
if data is not None:
|
||||
if data.get("data") is None:
|
||||
try:
|
||||
commands = list(commands)
|
||||
# noinspection PyTypeChecker
|
||||
commands.remove({"bos": {"faultLight": None}})
|
||||
command = merge(*commands)
|
||||
data = await self.send_gql_command(command)
|
||||
except (LookupError, ValueError):
|
||||
pass
|
||||
if not data:
|
||||
data = {}
|
||||
data["multicommand"] = False
|
||||
return data
|
||||
|
||||
async def auth(self, client: httpx.AsyncClient) -> None:
|
||||
url = f"http://{self.ip}/graphql"
|
||||
await client.post(
|
||||
url,
|
||||
json={
|
||||
"query": 'mutation{auth{login(username:"'
|
||||
+ "root"
|
||||
+ '", password:"'
|
||||
+ self.pwd
|
||||
+ '"){__typename}}}'
|
||||
},
|
||||
)
|
||||
|
||||
async def send_luci_command(self, path: str, ignore_errors: bool = False) -> dict:
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
await self.luci_auth(client)
|
||||
data = await client.get(
|
||||
f"http://{self.ip}{path}", headers={"User-Agent": "BTC Tools v0.1"}
|
||||
)
|
||||
if data.status_code == 200:
|
||||
return data.json()
|
||||
if ignore_errors:
|
||||
return {}
|
||||
raise APIError(
|
||||
f"Web command failed: path={path}, code={data.status_code}"
|
||||
)
|
||||
except (httpx.HTTPError, json.JSONDecodeError):
|
||||
if ignore_errors:
|
||||
return {}
|
||||
raise APIError(f"Web command failed: path={path}")
|
||||
|
||||
async def luci_auth(self, session: httpx.AsyncClient):
|
||||
login = {"luci_username": self.username, "luci_password": self.pwd}
|
||||
url = f"http://{self.ip}/cgi-bin/luci"
|
||||
headers = {
|
||||
"User-Agent": "BTC Tools v0.1", # only seems to respond if this user-agent is set
|
||||
"Content-Type": "application/x-www-form-urlencoded",
|
||||
}
|
||||
await session.post(url, headers=headers, data=login)
|
||||
|
||||
async def get_net_conf(self):
|
||||
return await self.send_luci_command(
|
||||
"/cgi-bin/luci/admin/network/iface_status/lan"
|
||||
)
|
||||
|
||||
async def get_cfg_metadata(self):
|
||||
return await self.send_luci_command("/cgi-bin/luci/admin/miner/cfg_metadata")
|
||||
|
||||
async def get_cfg_data(self):
|
||||
return await self.send_luci_command("/cgi-bin/luci/admin/miner/cfg_data")
|
||||
|
||||
async def get_bos_info(self):
|
||||
return await self.send_luci_command("/cgi-bin/luci/bos/info")
|
||||
|
||||
async def get_overview(self):
|
||||
return await self.send_luci_command(
|
||||
"/cgi-bin/luci/admin/status/overview?status=1"
|
||||
) # needs status=1 or it fails
|
||||
|
||||
async def get_api_status(self):
|
||||
return await self.send_luci_command("/cgi-bin/luci/admin/miner/api_status")
|
||||
626
pyasic/web/bosminer/__init__.py
Normal file
@@ -0,0 +1,626 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
# Copyright 2022 Upstream Data Inc -
|
||||
# -
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); -
|
||||
# you may not use this file except in compliance with the License. -
|
||||
# You may obtain a copy of the License at -
|
||||
# -
|
||||
# http://www.apache.org/licenses/LICENSE-2.0 -
|
||||
# -
|
||||
# Unless required by applicable law or agreed to in writing, software -
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, -
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
|
||||
# See the License for the specific language governing permissions and -
|
||||
# limitations under the License. -
|
||||
# ------------------------------------------------------------------------------
|
||||
import json
|
||||
from datetime import datetime, timedelta
|
||||
from typing import List, Union
|
||||
|
||||
import grpc_requests
|
||||
import httpx
|
||||
from google.protobuf.message import Message
|
||||
from grpc import RpcError
|
||||
|
||||
from pyasic import APIError, settings
|
||||
from pyasic.web import BaseWebAPI
|
||||
from pyasic.web.bosminer.proto import (
|
||||
get_auth_service_descriptors,
|
||||
get_service_descriptors,
|
||||
)
|
||||
from pyasic.web.bosminer.proto.bos.v1.actions_pb2 import ( # noqa: this will be defined
|
||||
SetLocateDeviceStatusRequest,
|
||||
)
|
||||
from pyasic.web.bosminer.proto.bos.v1.authentication_pb2 import ( # noqa: this will be defined
|
||||
SetPasswordRequest,
|
||||
)
|
||||
from pyasic.web.bosminer.proto.bos.v1.common_pb2 import ( # noqa: this will be defined
|
||||
SaveAction,
|
||||
)
|
||||
from pyasic.web.bosminer.proto.bos.v1.cooling_pb2 import ( # noqa: this will be defined
|
||||
SetImmersionModeRequest,
|
||||
)
|
||||
from pyasic.web.bosminer.proto.bos.v1.miner_pb2 import ( # noqa: this will be defined
|
||||
DisableHashboardsRequest,
|
||||
EnableHashboardsRequest,
|
||||
)
|
||||
from pyasic.web.bosminer.proto.bos.v1.performance_pb2 import ( # noqa: this will be defined
|
||||
DecrementHashrateTargetRequest,
|
||||
DecrementPowerTargetRequest,
|
||||
IncrementHashrateTargetRequest,
|
||||
IncrementPowerTargetRequest,
|
||||
SetDefaultHashrateTargetRequest,
|
||||
SetDefaultPowerTargetRequest,
|
||||
SetHashrateTargetRequest,
|
||||
SetPowerTargetRequest,
|
||||
)
|
||||
|
||||
|
||||
class BOSMinerWebAPI(BaseWebAPI):
|
||||
def __init__(self, ip: str, boser: bool = None) -> None:
|
||||
if boser is None:
|
||||
boser = True
|
||||
|
||||
if boser:
|
||||
self.gql = BOSMinerGQLAPI(
|
||||
ip, settings.get("default_bosminer_password", "root")
|
||||
)
|
||||
self.grpc = BOSMinerGRPCAPI(
|
||||
ip, settings.get("default_bosminer_password", "root")
|
||||
)
|
||||
else:
|
||||
self.gql = None
|
||||
self.grpc = None
|
||||
self.luci = BOSMinerLuCIAPI(
|
||||
ip, settings.get("default_bosminer_password", "root")
|
||||
)
|
||||
|
||||
self._pwd = settings.get("default_bosminer_password", "root")
|
||||
super().__init__(ip)
|
||||
|
||||
@property
|
||||
def pwd(self):
|
||||
return self._pwd
|
||||
|
||||
@pwd.setter
|
||||
def pwd(self, other: str):
|
||||
self._pwd = other
|
||||
self.luci.pwd = other
|
||||
if self.gql is not None:
|
||||
self.gql.pwd = other
|
||||
if self.grpc is not None:
|
||||
self.grpc.pwd = other
|
||||
|
||||
async def send_command(
|
||||
self,
|
||||
command: Union[str, dict],
|
||||
ignore_errors: bool = False,
|
||||
allow_warning: bool = True,
|
||||
**parameters: Union[str, int, bool],
|
||||
) -> dict:
|
||||
if isinstance(command, dict):
|
||||
if self.gql is not None:
|
||||
return await self.gql.send_command(command)
|
||||
elif command.startswith("/cgi-bin/luci"):
|
||||
return await self.gql.send_command(command)
|
||||
else:
|
||||
if self.grpc is not None:
|
||||
return await self.grpc.send_command(command)
|
||||
|
||||
async def multicommand(
|
||||
self, *commands: Union[dict, str], allow_warning: bool = True
|
||||
) -> dict:
|
||||
luci_commands = []
|
||||
gql_commands = []
|
||||
grpc_commands = []
|
||||
for cmd in commands:
|
||||
if isinstance(cmd, dict):
|
||||
gql_commands.append(cmd)
|
||||
elif cmd.startswith("/cgi-bin/luci"):
|
||||
luci_commands.append(cmd)
|
||||
else:
|
||||
grpc_commands.append(cmd)
|
||||
|
||||
luci_data = await self.luci.multicommand(*luci_commands)
|
||||
if self.gql is not None:
|
||||
gql_data = await self.gql.multicommand(*gql_commands)
|
||||
else:
|
||||
gql_data = None
|
||||
if self.grpc is not None:
|
||||
grpc_data = await self.grpc.multicommand(*grpc_commands)
|
||||
else:
|
||||
grpc_data = None
|
||||
|
||||
if gql_data is None:
|
||||
gql_data = {}
|
||||
if luci_data is None:
|
||||
luci_data = {}
|
||||
if grpc_data is None:
|
||||
grpc_data = {}
|
||||
|
||||
data = dict(**luci_data, **gql_data, **grpc_data)
|
||||
return data
|
||||
|
||||
|
||||
class BOSMinerGQLAPI:
|
||||
def __init__(self, ip: str, pwd: str):
|
||||
self.ip = ip
|
||||
self.username = "root"
|
||||
self.pwd = pwd
|
||||
|
||||
async def multicommand(self, *commands: dict) -> dict:
|
||||
def merge(*d: dict):
|
||||
ret = {}
|
||||
for i in d:
|
||||
if i:
|
||||
for k in i:
|
||||
if not k in ret:
|
||||
ret[k] = i[k]
|
||||
else:
|
||||
ret[k] = merge(ret[k], i[k])
|
||||
return None if ret == {} else ret
|
||||
|
||||
command = merge(*commands)
|
||||
data = await self.send_command(command)
|
||||
if data is not None:
|
||||
if data.get("data") is None:
|
||||
try:
|
||||
commands = list(commands)
|
||||
# noinspection PyTypeChecker
|
||||
commands.remove({"bos": {"faultLight": None}})
|
||||
command = merge(*commands)
|
||||
data = await self.send_command(command)
|
||||
except (LookupError, ValueError):
|
||||
pass
|
||||
if not data:
|
||||
data = {}
|
||||
data["multicommand"] = False
|
||||
return data
|
||||
|
||||
async def send_command(
|
||||
self,
|
||||
command: dict,
|
||||
) -> dict:
|
||||
url = f"http://{self.ip}/graphql"
|
||||
query = command
|
||||
if command.get("query") is None:
|
||||
query = {"query": self.parse_command(command)}
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
await self.auth(client)
|
||||
data = await client.post(url, json=query)
|
||||
except httpx.HTTPError:
|
||||
pass
|
||||
else:
|
||||
if data.status_code == 200:
|
||||
try:
|
||||
return data.json()
|
||||
except json.decoder.JSONDecodeError:
|
||||
pass
|
||||
|
||||
def parse_command(self, graphql_command: Union[dict, set]) -> str:
|
||||
if isinstance(graphql_command, dict):
|
||||
data = []
|
||||
for key in graphql_command:
|
||||
if graphql_command[key] is not None:
|
||||
parsed = self.parse_command(graphql_command[key])
|
||||
data.append(key + parsed)
|
||||
else:
|
||||
data.append(key)
|
||||
else:
|
||||
data = graphql_command
|
||||
return "{" + ",".join(data) + "}"
|
||||
|
||||
async def auth(self, client: httpx.AsyncClient) -> None:
|
||||
url = f"http://{self.ip}/graphql"
|
||||
await client.post(
|
||||
url,
|
||||
json={
|
||||
"query": 'mutation{auth{login(username:"'
|
||||
+ "root"
|
||||
+ '", password:"'
|
||||
+ self.pwd
|
||||
+ '"){__typename}}}'
|
||||
},
|
||||
)
|
||||
|
||||
|
||||
class BOSMinerLuCIAPI:
|
||||
def __init__(self, ip: str, pwd: str):
|
||||
self.ip = ip
|
||||
self.username = "root"
|
||||
self.pwd = pwd
|
||||
|
||||
async def multicommand(self, *commands: str) -> dict:
|
||||
data = {}
|
||||
for command in commands:
|
||||
data[command] = await self.send_command(command, ignore_errors=True)
|
||||
return data
|
||||
|
||||
async def send_command(self, path: str, ignore_errors: bool = False) -> dict:
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
await self.auth(client)
|
||||
data = await client.get(
|
||||
f"http://{self.ip}{path}", headers={"User-Agent": "BTC Tools v0.1"}
|
||||
)
|
||||
if data.status_code == 200:
|
||||
return data.json()
|
||||
if ignore_errors:
|
||||
return {}
|
||||
raise APIError(
|
||||
f"Web command failed: path={path}, code={data.status_code}"
|
||||
)
|
||||
except (httpx.HTTPError, json.JSONDecodeError):
|
||||
if ignore_errors:
|
||||
return {}
|
||||
raise APIError(f"Web command failed: path={path}")
|
||||
|
||||
async def auth(self, session: httpx.AsyncClient):
|
||||
login = {"luci_username": self.username, "luci_password": self.pwd}
|
||||
url = f"http://{self.ip}/cgi-bin/luci"
|
||||
headers = {
|
||||
"User-Agent": "BTC Tools v0.1", # only seems to respond if this user-agent is set
|
||||
"Content-Type": "application/x-www-form-urlencoded",
|
||||
}
|
||||
await session.post(url, headers=headers, data=login)
|
||||
|
||||
async def get_net_conf(self):
|
||||
return await self.send_command("/cgi-bin/luci/admin/network/iface_status/lan")
|
||||
|
||||
async def get_cfg_metadata(self):
|
||||
return await self.send_command("/cgi-bin/luci/admin/miner/cfg_metadata")
|
||||
|
||||
async def get_cfg_data(self):
|
||||
return await self.send_command("/cgi-bin/luci/admin/miner/cfg_data")
|
||||
|
||||
async def get_bos_info(self):
|
||||
return await self.send_command("/cgi-bin/luci/bos/info")
|
||||
|
||||
async def get_overview(self):
|
||||
return await self.send_command(
|
||||
"/cgi-bin/luci/admin/status/overview?status=1"
|
||||
) # needs status=1 or it fails
|
||||
|
||||
async def get_api_status(self):
|
||||
return await self.send_command("/cgi-bin/luci/admin/miner/api_status")
|
||||
|
||||
|
||||
class BOSMinerGRPCAPI:
|
||||
def __init__(self, ip: str, pwd: str):
|
||||
self.ip = ip
|
||||
self.username = "root"
|
||||
self.pwd = pwd
|
||||
self._auth = None
|
||||
self._auth_time = datetime.now()
|
||||
|
||||
@property
|
||||
def commands(self) -> list:
|
||||
return self.get_commands()
|
||||
|
||||
def get_commands(self) -> list:
|
||||
return [
|
||||
func
|
||||
for func in
|
||||
# each function in self
|
||||
dir(self)
|
||||
if func
|
||||
not in ["send_command", "multicommand", "auth", "commands", "get_commands"]
|
||||
if callable(getattr(self, func)) and
|
||||
# no __ or _ methods
|
||||
not func.startswith("__") and not func.startswith("_")
|
||||
]
|
||||
|
||||
async def multicommand(self, *commands: str) -> dict:
|
||||
pass
|
||||
|
||||
async def send_command(
|
||||
self,
|
||||
command: str,
|
||||
message: Message = None,
|
||||
ignore_errors: bool = False,
|
||||
auth: bool = True,
|
||||
) -> dict:
|
||||
service, method = command.split("/")
|
||||
metadata = []
|
||||
if auth:
|
||||
metadata.append(("authorization", await self.auth()))
|
||||
async with grpc_requests.StubAsyncClient(
|
||||
f"{self.ip}:50051", service_descriptors=get_service_descriptors()
|
||||
) as client:
|
||||
await client.register_all_service()
|
||||
try:
|
||||
return await client.request(
|
||||
service,
|
||||
method,
|
||||
request=message,
|
||||
metadata=metadata,
|
||||
)
|
||||
except RpcError as e:
|
||||
if ignore_errors:
|
||||
return {}
|
||||
raise APIError(e._details)
|
||||
|
||||
async def auth(self):
|
||||
if self._auth is not None and self._auth_time - datetime.now() < timedelta(
|
||||
seconds=3540
|
||||
):
|
||||
return self._auth
|
||||
await self._get_auth()
|
||||
return self._auth
|
||||
|
||||
async def _get_auth(self):
|
||||
async with grpc_requests.StubAsyncClient(
|
||||
f"{self.ip}:50051", service_descriptors=get_auth_service_descriptors()
|
||||
) as client:
|
||||
await client.register_all_service()
|
||||
method_meta = client.get_method_meta(
|
||||
"braiins.bos.v1.AuthenticationService", "Login"
|
||||
)
|
||||
_request = method_meta.method_type.request_parser(
|
||||
{"username": self.username, "password": self.pwd},
|
||||
method_meta.input_type,
|
||||
)
|
||||
metadata = await method_meta.handler(_request).initial_metadata()
|
||||
|
||||
for key, value in metadata:
|
||||
if key == "authorization":
|
||||
self._auth = value
|
||||
self._auth_time = datetime.now()
|
||||
return self._auth
|
||||
|
||||
async def get_api_version(self):
|
||||
return await self.send_command(
|
||||
"braiins.bos.ApiVersionService/GetApiVersion", auth=False
|
||||
)
|
||||
|
||||
async def start(self):
|
||||
return await self.send_command("braiins.bos.v1.ActionsService/Start")
|
||||
|
||||
async def stop(self):
|
||||
return await self.send_command("braiins.bos.v1.ActionsService/Stop")
|
||||
|
||||
async def pause_mining(self):
|
||||
return await self.send_command("braiins.bos.v1.ActionsService/PauseMining")
|
||||
|
||||
async def resume_mining(self):
|
||||
return await self.send_command("braiins.bos.v1.ActionsService/ResumeMining")
|
||||
|
||||
async def restart(self):
|
||||
return await self.send_command("braiins.bos.v1.ActionsService/Restart")
|
||||
|
||||
async def reboot(self):
|
||||
return await self.send_command("braiins.bos.v1.ActionsService/Reboot")
|
||||
|
||||
async def set_locate_device_status(self, enable: bool):
|
||||
message = SetLocateDeviceStatusRequest()
|
||||
message.enable = enable
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.ActionsService/SetLocateDeviceStatus", message=message
|
||||
)
|
||||
|
||||
async def get_locate_device_status(self):
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.ActionsService/GetLocateDeviceStatus"
|
||||
)
|
||||
|
||||
async def set_password(self, password: str = None):
|
||||
message = SetPasswordRequest()
|
||||
if password:
|
||||
message.password = password
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.AuthenticationService/SetPassword", message=message
|
||||
)
|
||||
|
||||
async def get_cooling_state(self):
|
||||
return await self.send_command("braiins.bos.v1.CoolingService/GetCoolingState")
|
||||
|
||||
async def set_immersion_mode(
|
||||
self,
|
||||
enable: bool,
|
||||
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
|
||||
):
|
||||
message = SetImmersionModeRequest()
|
||||
message.enable = enable
|
||||
message.save_action = save_action
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.CoolingService/SetImmersionMode", message=message
|
||||
)
|
||||
|
||||
async def get_tuner_state(self):
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/GetTunerState"
|
||||
)
|
||||
|
||||
async def list_target_profiles(self):
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/ListTargetProfiles"
|
||||
)
|
||||
|
||||
async def set_default_power_target(
|
||||
self, save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY
|
||||
):
|
||||
message = SetDefaultPowerTargetRequest()
|
||||
message.save_action = save_action
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/SetDefaultPowerTarget", message=message
|
||||
)
|
||||
|
||||
async def set_power_target(
|
||||
self,
|
||||
power_target: int,
|
||||
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
|
||||
):
|
||||
message = SetPowerTargetRequest()
|
||||
message.power_target.watt = power_target
|
||||
message.save_action = save_action
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/SetPowerTarget", message=message
|
||||
)
|
||||
|
||||
async def increment_power_target(
|
||||
self,
|
||||
power_target_increment: int,
|
||||
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
|
||||
):
|
||||
message = IncrementPowerTargetRequest()
|
||||
message.power_target_increment.watt = power_target_increment
|
||||
message.save_action = save_action
|
||||
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/IncrementPowerTarget", message=message
|
||||
)
|
||||
|
||||
async def decrement_power_target(
|
||||
self,
|
||||
power_target_decrement: int,
|
||||
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
|
||||
):
|
||||
message = DecrementPowerTargetRequest()
|
||||
message.power_target_decrement.watt = power_target_decrement
|
||||
message.save_action = save_action
|
||||
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/DecrementPowerTarget",
|
||||
message=message,
|
||||
)
|
||||
|
||||
async def set_default_hashrate_target(
|
||||
self, save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY
|
||||
):
|
||||
message = SetDefaultHashrateTargetRequest()
|
||||
message.save_action = save_action
|
||||
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/SetDefaultHashrateTarget",
|
||||
message=message,
|
||||
)
|
||||
|
||||
async def set_hashrate_target(
|
||||
self,
|
||||
hashrate_target: int,
|
||||
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
|
||||
):
|
||||
message = SetHashrateTargetRequest()
|
||||
message.hashrate_target.terahash_per_second = hashrate_target
|
||||
message.save_action = save_action
|
||||
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/SetHashrateTarget", message=message
|
||||
)
|
||||
|
||||
async def increment_hashrate_target(
|
||||
self,
|
||||
hashrate_target_increment: int,
|
||||
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
|
||||
):
|
||||
message = IncrementHashrateTargetRequest()
|
||||
message.hashrate_target_increment.terahash_per_second = (
|
||||
hashrate_target_increment
|
||||
)
|
||||
message.save_action = save_action
|
||||
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/IncrementHashrateTarget",
|
||||
message=message,
|
||||
)
|
||||
|
||||
async def decrement_hashrate_target(
|
||||
self,
|
||||
hashrate_target_decrement: int,
|
||||
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
|
||||
):
|
||||
message = DecrementHashrateTargetRequest()
|
||||
message.hashrate_target_decrement.terahash_per_second = (
|
||||
hashrate_target_decrement
|
||||
)
|
||||
message.save_action = save_action
|
||||
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/DecrementHashrateTarget",
|
||||
message=message,
|
||||
)
|
||||
|
||||
async def set_dps(self):
|
||||
raise NotImplementedError
|
||||
return await self.send_command("braiins.bos.v1.PerformanceService/SetDPS")
|
||||
|
||||
async def set_performance_mode(self):
|
||||
raise NotImplementedError
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/SetPerformanceMode"
|
||||
)
|
||||
|
||||
async def get_active_performance_mode(self):
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.PerformanceService/GetActivePerformanceMode"
|
||||
)
|
||||
|
||||
async def get_pool_groups(self):
|
||||
return await self.send_command("braiins.bos.v1.PoolService/GetPoolGroups")
|
||||
|
||||
async def create_pool_group(self):
|
||||
raise NotImplementedError
|
||||
return await self.send_command("braiins.bos.v1.PoolService/CreatePoolGroup")
|
||||
|
||||
async def update_pool_group(self):
|
||||
raise NotImplementedError
|
||||
return await self.send_command("braiins.bos.v1.PoolService/UpdatePoolGroup")
|
||||
|
||||
async def remove_pool_group(self):
|
||||
raise NotImplementedError
|
||||
return await self.send_command("braiins.bos.v1.PoolService/RemovePoolGroup")
|
||||
|
||||
async def get_miner_configuration(self):
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.ConfigurationService/GetMinerConfiguration"
|
||||
)
|
||||
|
||||
async def get_constraints(self):
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.ConfigurationService/GetConstraints"
|
||||
)
|
||||
|
||||
async def get_license_state(self):
|
||||
return await self.send_command("braiins.bos.v1.LicenseService/GetLicenseState")
|
||||
|
||||
async def get_miner_status(self):
|
||||
return await self.send_command("braiins.bos.v1.MinerService/GetMinerStatus")
|
||||
|
||||
async def get_miner_details(self):
|
||||
return await self.send_command("braiins.bos.v1.MinerService/GetMinerDetails")
|
||||
|
||||
async def get_miner_stats(self):
|
||||
return await self.send_command("braiins.bos.v1.MinerService/GetMinerStats")
|
||||
|
||||
async def get_hashboards(self):
|
||||
return await self.send_command("braiins.bos.v1.MinerService/GetHashboards")
|
||||
|
||||
async def get_support_archive(self):
|
||||
return await self.send_command("braiins.bos.v1.MinerService/GetSupportArchive")
|
||||
|
||||
async def enable_hashboards(
|
||||
self,
|
||||
hashboard_ids: List[str],
|
||||
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
|
||||
):
|
||||
message = EnableHashboardsRequest()
|
||||
message.hashboard_ids[:] = hashboard_ids
|
||||
message.save_action = save_action
|
||||
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.MinerService/EnableHashboards", message=message
|
||||
)
|
||||
|
||||
async def disable_hashboards(
|
||||
self,
|
||||
hashboard_ids: List[str],
|
||||
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
|
||||
):
|
||||
message = DisableHashboardsRequest()
|
||||
message.hashboard_ids[:] = hashboard_ids
|
||||
message.save_action = save_action
|
||||
|
||||
return await self.send_command(
|
||||
"braiins.bos.v1.MinerService/DisableHashboards", message=message
|
||||
)
|
||||
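The new `BOSMinerWebAPI` above routes by command type: dicts go to the GraphQL API, `/cgi-bin/luci` paths to the LuCI API, and `Service/Method` strings to the gRPC API (when `boser` is enabled). A usage sketch mixing all three through `multicommand()` (placeholder IP, assumes a reachable BOSMiner host):

```python
import asyncio

from pyasic.web.bosminer import BOSMinerWebAPI


async def main():
    web = BOSMinerWebAPI("10.0.0.10")  # placeholder IP
    data = await web.multicommand(
        {"bos": {"faultLight": None}},                  # GraphQL
        "/cgi-bin/luci/admin/miner/api_status",         # LuCI
        "braiins.bos.v1.MinerService/GetMinerDetails",  # gRPC
    )
    print(list(data.keys()))


asyncio.run(main())
```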
54
pyasic/web/bosminer/proto/__init__.py
Normal file
@@ -0,0 +1,54 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
# Copyright 2022 Upstream Data Inc -
|
||||
# -
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); -
|
||||
# you may not use this file except in compliance with the License. -
|
||||
# You may obtain a copy of the License at -
|
||||
# -
|
||||
# http://www.apache.org/licenses/LICENSE-2.0 -
|
||||
# -
|
||||
# Unless required by applicable law or agreed to in writing, software -
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, -
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
|
||||
# See the License for the specific language governing permissions and -
|
||||
# limitations under the License. -
|
||||
# ------------------------------------------------------------------------------
|
||||
from __future__ import annotations
|
||||
|
||||
from .bos import version_pb2
|
||||
from .bos.v1 import (
|
||||
actions_pb2,
|
||||
authentication_pb2,
|
||||
common_pb2,
|
||||
configuration_pb2,
|
||||
constraints_pb2,
|
||||
cooling_pb2,
|
||||
license_pb2,
|
||||
miner_pb2,
|
||||
performance_pb2,
|
||||
pool_pb2,
|
||||
units_pb2,
|
||||
work_pb2,
|
||||
)
|
||||
|
||||
|
||||
def get_service_descriptors():
|
||||
return [
|
||||
*version_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*authentication_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*actions_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*common_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*configuration_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*constraints_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*cooling_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*license_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*miner_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*performance_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*pool_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*units_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
*work_pb2.DESCRIPTOR.services_by_name.values(),
|
||||
]
|
||||
|
||||
|
||||
def get_auth_service_descriptors():
|
||||
return authentication_pb2.DESCRIPTOR.services_by_name.values()
|
||||
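`get_service_descriptors()` above just aggregates the generated `*_pb2` service descriptors so `grpc_requests` can register every BOS service on one stub client. A sketch mirroring how `BOSMinerGRPCAPI.send_command()` consumes it (placeholder host, unauthenticated call only):

```python
import asyncio

import grpc_requests

from pyasic.web.bosminer.proto import get_service_descriptors


async def main():
    async with grpc_requests.StubAsyncClient(
        "10.0.0.10:50051", service_descriptors=get_service_descriptors()
    ) as client:
        await client.register_all_service()
        # same unauthenticated call BOSMinerGRPCAPI.get_api_version() makes
        print(
            await client.request(
                "braiins.bos.ApiVersionService",
                "GetApiVersion",
                request=None,
                metadata=[],
            )
        )


asyncio.run(main())
```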
15
pyasic/web/bosminer/proto/bos/__init__.py
Normal file
@@ -0,0 +1,15 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
# Copyright 2022 Upstream Data Inc -
|
||||
# -
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); -
|
||||
# you may not use this file except in compliance with the License. -
|
||||
# You may obtain a copy of the License at -
|
||||
# -
|
||||
# http://www.apache.org/licenses/LICENSE-2.0 -
|
||||
# -
|
||||
# Unless required by applicable law or agreed to in writing, software -
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, -
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
|
||||
# See the License for the specific language governing permissions and -
|
||||
# limitations under the License. -
|
||||
# ------------------------------------------------------------------------------
|
||||
15
pyasic/web/bosminer/proto/bos/v1/__init__.py
Normal file
@@ -0,0 +1,15 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
# Copyright 2022 Upstream Data Inc -
|
||||
# -
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); -
|
||||
# you may not use this file except in compliance with the License. -
|
||||
# You may obtain a copy of the License at -
|
||||
# -
|
||||
# http://www.apache.org/licenses/LICENSE-2.0 -
|
||||
# -
|
||||
# Unless required by applicable law or agreed to in writing, software -
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, -
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
|
||||
# See the License for the specific language governing permissions and -
|
||||
# limitations under the License. -
|
||||
# ------------------------------------------------------------------------------
|
||||
23
pyasic/web/bosminer/proto/bos/v1/actions.py
Normal file
@@ -0,0 +1,23 @@
|
||||
# ------------------------------------------------------------------------------
|
||||
# Copyright 2022 Upstream Data Inc -
|
||||
# -
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); -
|
||||
# you may not use this file except in compliance with the License. -
|
||||
# You may obtain a copy of the License at -
|
||||
# -
|
||||
# http://www.apache.org/licenses/LICENSE-2.0 -
|
||||
# -
|
||||
# Unless required by applicable law or agreed to in writing, software -
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, -
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
|
||||
# See the License for the specific language governing permissions and -
|
||||
# limitations under the License. -
|
||||
# ------------------------------------------------------------------------------
|
||||
from enum import Enum
|
||||
|
||||
|
||||
class SaveAction(Enum):
|
||||
UNSPECIFIED = "SaveAction.SAVE_ACTION_UNSPECIFIED"
|
||||
SAVE = "SaveAction.SAVE_ACTION_SAVE"
|
||||
SAVE_AND_APPLY = "SaveAction.SAVE_ACTION_SAVE_AND_APPLY"
|
||||
SAVE_AND_FORCE_APPLY = "SaveAction.SAVE_ACTION_SAVE_AND_FORCE_APPLY"
|
||||
56
pyasic/web/bosminer/proto/bos/v1/actions_pb2.py
Normal file
@@ -0,0 +1,56 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by the protocol buffer compiler. DO NOT EDIT!
|
||||
# source: bos/v1/actions.proto
|
||||
"""Generated protocol buffer code."""
|
||||
from google.protobuf import descriptor as _descriptor
|
||||
from google.protobuf import descriptor_pool as _descriptor_pool
|
||||
from google.protobuf import symbol_database as _symbol_database
|
||||
from google.protobuf.internal import builder as _builder
|
||||
|
||||
# @@protoc_insertion_point(imports)
|
||||
|
||||
_sym_db = _symbol_database.Default()
|
||||
|
||||
|
||||
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
|
||||
b'\n\x14\x62os/v1/actions.proto\x12\x0e\x62raiins.bos.v1"\x0e\n\x0cStartRequest"(\n\rStartResponse\x12\x17\n\x0f\x61lready_running\x18\x01 \x01(\x08"\x10\n\x0eRestartRequest"*\n\x0fRestartResponse\x12\x17\n\x0f\x61lready_running\x18\x01 \x01(\x08"\x0f\n\rRebootRequest"\x10\n\x0eRebootResponse"\r\n\x0bStopRequest"\'\n\x0cStopResponse\x12\x17\n\x0f\x61lready_stopped\x18\x01 \x01(\x08"\x14\n\x12PauseMiningRequest"-\n\x13PauseMiningResponse\x12\x16\n\x0e\x61lready_paused\x18\x01 \x01(\x08"\x15\n\x13ResumeMiningRequest".\n\x14ResumeMiningResponse\x12\x16\n\x0e\x61lready_mining\x18\x01 \x01(\x08".\n\x1cSetLocateDeviceStatusRequest\x12\x0e\n\x06\x65nable\x18\x01 \x01(\x08"-\n\x1aLocateDeviceStatusResponse\x12\x0f\n\x07\x65nabled\x18\x01 \x01(\x08"\x1e\n\x1cGetLocateDeviceStatusRequest2\xc7\x05\n\x0e\x41\x63tionsService\x12\x44\n\x05Start\x12\x1c.braiins.bos.v1.StartRequest\x1a\x1d.braiins.bos.v1.StartResponse\x12\x41\n\x04Stop\x12\x1b.braiins.bos.v1.StopRequest\x1a\x1c.braiins.bos.v1.StopResponse\x12V\n\x0bPauseMining\x12".braiins.bos.v1.PauseMiningRequest\x1a#.braiins.bos.v1.PauseMiningResponse\x12Y\n\x0cResumeMining\x12#.braiins.bos.v1.ResumeMiningRequest\x1a$.braiins.bos.v1.ResumeMiningResponse\x12J\n\x07Restart\x12\x1e.braiins.bos.v1.RestartRequest\x1a\x1f.braiins.bos.v1.RestartResponse\x12G\n\x06Reboot\x12\x1d.braiins.bos.v1.RebootRequest\x1a\x1e.braiins.bos.v1.RebootResponse\x12q\n\x15SetLocateDeviceStatus\x12,.braiins.bos.v1.SetLocateDeviceStatusRequest\x1a*.braiins.bos.v1.LocateDeviceStatusResponse\x12q\n\x15GetLocateDeviceStatus\x12,.braiins.bos.v1.GetLocateDeviceStatusRequest\x1a*.braiins.bos.v1.LocateDeviceStatusResponseb\x06proto3'
|
||||
)
|
||||
|
||||
_globals = globals()
|
||||
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
|
||||
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.actions_pb2", _globals)
|
||||
if _descriptor._USE_C_DESCRIPTORS == False:
|
||||
DESCRIPTOR._options = None
|
||||
_globals["_STARTREQUEST"]._serialized_start = 40
|
||||
_globals["_STARTREQUEST"]._serialized_end = 54
|
||||
_globals["_STARTRESPONSE"]._serialized_start = 56
|
||||
_globals["_STARTRESPONSE"]._serialized_end = 96
|
||||
_globals["_RESTARTREQUEST"]._serialized_start = 98
|
||||
_globals["_RESTARTREQUEST"]._serialized_end = 114
|
||||
_globals["_RESTARTRESPONSE"]._serialized_start = 116
|
||||
_globals["_RESTARTRESPONSE"]._serialized_end = 158
|
||||
_globals["_REBOOTREQUEST"]._serialized_start = 160
|
||||
_globals["_REBOOTREQUEST"]._serialized_end = 175
|
||||
_globals["_REBOOTRESPONSE"]._serialized_start = 177
|
||||
_globals["_REBOOTRESPONSE"]._serialized_end = 193
|
||||
_globals["_STOPREQUEST"]._serialized_start = 195
|
||||
_globals["_STOPREQUEST"]._serialized_end = 208
|
||||
_globals["_STOPRESPONSE"]._serialized_start = 210
|
||||
_globals["_STOPRESPONSE"]._serialized_end = 249
|
||||
_globals["_PAUSEMININGREQUEST"]._serialized_start = 251
|
||||
_globals["_PAUSEMININGREQUEST"]._serialized_end = 271
|
||||
_globals["_PAUSEMININGRESPONSE"]._serialized_start = 273
|
||||
_globals["_PAUSEMININGRESPONSE"]._serialized_end = 318
|
||||
_globals["_RESUMEMININGREQUEST"]._serialized_start = 320
|
||||
_globals["_RESUMEMININGREQUEST"]._serialized_end = 341
|
||||
_globals["_RESUMEMININGRESPONSE"]._serialized_start = 343
|
||||
_globals["_RESUMEMININGRESPONSE"]._serialized_end = 389
|
||||
_globals["_SETLOCATEDEVICESTATUSREQUEST"]._serialized_start = 391
|
||||
_globals["_SETLOCATEDEVICESTATUSREQUEST"]._serialized_end = 437
|
||||
_globals["_LOCATEDEVICESTATUSRESPONSE"]._serialized_start = 439
|
||||
_globals["_LOCATEDEVICESTATUSRESPONSE"]._serialized_end = 484
|
||||
_globals["_GETLOCATEDEVICESTATUSREQUEST"]._serialized_start = 486
|
||||
_globals["_GETLOCATEDEVICESTATUSREQUEST"]._serialized_end = 516
|
||||
_globals["_ACTIONSSERVICE"]._serialized_start = 519
|
||||
_globals["_ACTIONSSERVICE"]._serialized_end = 1230
|
||||
# @@protoc_insertion_point(module_scope)
|
||||
36
pyasic/web/bosminer/proto/bos/v1/authentication_pb2.py
Normal file
@@ -0,0 +1,36 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by the protocol buffer compiler. DO NOT EDIT!
|
||||
# source: bos/v1/authentication.proto
|
||||
"""Generated protocol buffer code."""
|
||||
from google.protobuf import descriptor as _descriptor
|
||||
from google.protobuf import descriptor_pool as _descriptor_pool
|
||||
from google.protobuf import symbol_database as _symbol_database
|
||||
from google.protobuf.internal import builder as _builder
|
||||
|
||||
# @@protoc_insertion_point(imports)
|
||||
|
||||
_sym_db = _symbol_database.Default()
|
||||
|
||||
|
||||
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
|
||||
b'\n\x1b\x62os/v1/authentication.proto\x12\x0e\x62raiins.bos.v1"2\n\x0cLoginRequest\x12\x10\n\x08username\x18\x01 \x01(\t\x12\x10\n\x08password\x18\x02 \x01(\t"\x0f\n\rLoginResponse"8\n\x12SetPasswordRequest\x12\x15\n\x08password\x18\x01 \x01(\tH\x00\x88\x01\x01\x42\x0b\n\t_password"\x15\n\x13SetPasswordResponse2\xb5\x01\n\x15\x41uthenticationService\x12\x44\n\x05Login\x12\x1c.braiins.bos.v1.LoginRequest\x1a\x1d.braiins.bos.v1.LoginResponse\x12V\n\x0bSetPassword\x12".braiins.bos.v1.SetPasswordRequest\x1a#.braiins.bos.v1.SetPasswordResponseb\x06proto3'
|
||||
)
|
||||
|
||||
_globals = globals()
|
||||
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
|
||||
_builder.BuildTopDescriptorsAndMessages(
|
||||
DESCRIPTOR, "bos.v1.authentication_pb2", _globals
|
||||
)
|
||||
if _descriptor._USE_C_DESCRIPTORS == False:
|
||||
DESCRIPTOR._options = None
|
||||
_globals["_LOGINREQUEST"]._serialized_start = 47
|
||||
_globals["_LOGINREQUEST"]._serialized_end = 97
|
||||
_globals["_LOGINRESPONSE"]._serialized_start = 99
|
||||
_globals["_LOGINRESPONSE"]._serialized_end = 114
|
||||
_globals["_SETPASSWORDREQUEST"]._serialized_start = 116
|
||||
_globals["_SETPASSWORDREQUEST"]._serialized_end = 172
|
||||
_globals["_SETPASSWORDRESPONSE"]._serialized_start = 174
|
||||
_globals["_SETPASSWORDRESPONSE"]._serialized_end = 195
|
||||
_globals["_AUTHENTICATIONSERVICE"]._serialized_start = 198
|
||||
_globals["_AUTHENTICATIONSERVICE"]._serialized_end = 379
|
||||
# @@protoc_insertion_point(module_scope)
|
||||
26
pyasic/web/bosminer/proto/bos/v1/common_pb2.py
Normal file
@@ -0,0 +1,26 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by the protocol buffer compiler. DO NOT EDIT!
|
||||
# source: bos/v1/common.proto
|
||||
"""Generated protocol buffer code."""
|
||||
from google.protobuf import descriptor as _descriptor
|
||||
from google.protobuf import descriptor_pool as _descriptor_pool
|
||||
from google.protobuf import symbol_database as _symbol_database
|
||||
from google.protobuf.internal import builder as _builder
|
||||
|
||||
# @@protoc_insertion_point(imports)
|
||||
|
||||
_sym_db = _symbol_database.Default()
|
||||
|
||||
|
||||
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
|
||||
b"\n\x13\x62os/v1/common.proto\x12\x0e\x62raiins.bos.v1*\x85\x01\n\nSaveAction\x12\x1b\n\x17SAVE_ACTION_UNSPECIFIED\x10\x00\x12\x14\n\x10SAVE_ACTION_SAVE\x10\x01\x12\x1e\n\x1aSAVE_ACTION_SAVE_AND_APPLY\x10\x02\x12$\n SAVE_ACTION_SAVE_AND_FORCE_APPLY\x10\x03\x62\x06proto3"
|
||||
)
|
||||
|
||||
_globals = globals()
|
||||
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
|
||||
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.common_pb2", _globals)
|
||||
if _descriptor._USE_C_DESCRIPTORS == False:
|
||||
DESCRIPTOR._options = None
|
||||
_globals["_SAVEACTION"]._serialized_start = 40
|
||||
_globals["_SAVEACTION"]._serialized_end = 173
|
||||
# @@protoc_insertion_point(module_scope)
|
||||
40
pyasic/web/bosminer/proto/bos/v1/configuration_pb2.py
Normal file
@@ -0,0 +1,40 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by the protocol buffer compiler. DO NOT EDIT!
|
||||
# source: bos/v1/configuration.proto
|
||||
"""Generated protocol buffer code."""
|
||||
from google.protobuf import descriptor as _descriptor
|
||||
from google.protobuf import descriptor_pool as _descriptor_pool
|
||||
from google.protobuf import symbol_database as _symbol_database
|
||||
from google.protobuf.internal import builder as _builder
|
||||
|
||||
# @@protoc_insertion_point(imports)
|
||||
|
||||
_sym_db = _symbol_database.Default()
|
||||
|
||||
|
||||
from ...bos.v1 import cooling_pb2 as bos_dot_v1_dot_cooling__pb2
|
||||
from ...bos.v1 import performance_pb2 as bos_dot_v1_dot_performance__pb2
|
||||
from ...bos.v1 import pool_pb2 as bos_dot_v1_dot_pool__pb2
|
||||
|
||||
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
|
||||
b'\n\x1a\x62os/v1/configuration.proto\x12\x0e\x62raiins.bos.v1\x1a\x14\x62os/v1/cooling.proto\x1a\x18\x62os/v1/performance.proto\x1a\x11\x62os/v1/pool.proto"\x1e\n\x1cGetMinerConfigurationRequest"\xc6\x02\n\x1dGetMinerConfigurationResponse\x12;\n\x0bpool_groups\x18\x01 \x03(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration\x12\x39\n\x0btemperature\x18\x02 \x01(\x0b\x32$.braiins.bos.v1.CoolingConfiguration\x12\x31\n\x05tuner\x18\x03 \x01(\x0b\x32".braiins.bos.v1.TunerConfiguration\x12-\n\x03\x64ps\x18\x04 \x01(\x0b\x32 .braiins.bos.v1.DPSConfiguration\x12K\n\x10hashboard_config\x18\x05 \x01(\x0b\x32\x31.braiins.bos.v1.HashboardPerformanceConfiguration"\x17\n\x15GetConstraintsRequest"\x95\x02\n\x16GetConstraintsResponse\x12;\n\x11tuner_constraints\x18\x01 \x01(\x0b\x32 .braiins.bos.v1.TunerConstraints\x12?\n\x13\x63ooling_constraints\x18\x02 \x01(\x0b\x32".braiins.bos.v1.CoolingConstraints\x12\x37\n\x0f\x64ps_constraints\x18\x03 \x01(\x0b\x32\x1e.braiins.bos.v1.DPSConstraints\x12\x44\n\x16hashboards_constraints\x18\x04 \x01(\x0b\x32$.braiins.bos.v1.HashboardConstraints2\xed\x01\n\x14\x43onfigurationService\x12t\n\x15GetMinerConfiguration\x12,.braiins.bos.v1.GetMinerConfigurationRequest\x1a-.braiins.bos.v1.GetMinerConfigurationResponse\x12_\n\x0eGetConstraints\x12%.braiins.bos.v1.GetConstraintsRequest\x1a&.braiins.bos.v1.GetConstraintsResponseb\x06proto3'
|
||||
)
|
||||
|
||||
_globals = globals()
|
||||
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
|
||||
_builder.BuildTopDescriptorsAndMessages(
|
||||
DESCRIPTOR, "bos.v1.configuration_pb2", _globals
|
||||
)
|
||||
if _descriptor._USE_C_DESCRIPTORS == False:
|
||||
DESCRIPTOR._options = None
|
||||
_globals["_GETMINERCONFIGURATIONREQUEST"]._serialized_start = 113
|
||||
_globals["_GETMINERCONFIGURATIONREQUEST"]._serialized_end = 143
|
||||
_globals["_GETMINERCONFIGURATIONRESPONSE"]._serialized_start = 146
|
||||
_globals["_GETMINERCONFIGURATIONRESPONSE"]._serialized_end = 472
|
||||
_globals["_GETCONSTRAINTSREQUEST"]._serialized_start = 474
|
||||
_globals["_GETCONSTRAINTSREQUEST"]._serialized_end = 497
|
||||
_globals["_GETCONSTRAINTSRESPONSE"]._serialized_start = 500
|
||||
_globals["_GETCONSTRAINTSRESPONSE"]._serialized_end = 777
|
||||
_globals["_CONFIGURATIONSERVICE"]._serialized_start = 780
|
||||
_globals["_CONFIGURATIONSERVICE"]._serialized_end = 1017
|
||||
# @@protoc_insertion_point(module_scope)
|
||||
44
pyasic/web/bosminer/proto/bos/v1/constraints_pb2.py
Normal file
@@ -0,0 +1,44 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by the protocol buffer compiler. DO NOT EDIT!
|
||||
# source: bos/v1/constraints.proto
|
||||
"""Generated protocol buffer code."""
|
||||
from google.protobuf import descriptor as _descriptor
|
||||
from google.protobuf import descriptor_pool as _descriptor_pool
|
||||
from google.protobuf import symbol_database as _symbol_database
|
||||
from google.protobuf.internal import builder as _builder
|
||||
|
||||
# @@protoc_insertion_point(imports)
|
||||
|
||||
_sym_db = _symbol_database.Default()
|
||||
|
||||
|
||||
from ...bos.v1 import units_pb2 as bos_dot_v1_dot_units__pb2
|
||||
|
||||
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
|
||||
b'\n\x18\x62os/v1/constraints.proto\x12\x0e\x62raiins.bos.v1\x1a\x12\x62os/v1/units.proto">\n\x11UInt32Constraints\x12\x0f\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\r\x12\x0b\n\x03min\x18\x02 \x01(\r\x12\x0b\n\x03max\x18\x03 \x01(\r">\n\x11\x44oubleConstraints\x12\x0f\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x01\x12\x0b\n\x03min\x18\x02 \x01(\x01\x12\x0b\n\x03max\x18\x03 \x01(\x01"\x82\x01\n\x10PowerConstraints\x12&\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x15.braiins.bos.v1.Power\x12"\n\x03min\x18\x02 \x01(\x0b\x32\x15.braiins.bos.v1.Power\x12"\n\x03max\x18\x03 \x01(\x0b\x32\x15.braiins.bos.v1.Power"\x9a\x01\n\x13HashrateConstraints\x12-\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x1c.braiins.bos.v1.TeraHashrate\x12)\n\x03min\x18\x02 \x01(\x0b\x32\x1c.braiins.bos.v1.TeraHashrate\x12)\n\x03max\x18\x03 \x01(\x0b\x32\x1c.braiins.bos.v1.TeraHashrate"\x9a\x01\n\x16TemperatureConstraints\x12,\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12(\n\x03min\x18\x02 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12(\n\x03max\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature"$\n\x11\x42ooleanConstraint\x12\x0f\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x08"\x85\x01\n\x13\x44urationConstraints\x12&\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x15.braiins.bos.v1.Hours\x12"\n\x03min\x18\x02 \x01(\x0b\x32\x15.braiins.bos.v1.Hours\x12"\n\x03max\x18\x03 \x01(\x0b\x32\x15.braiins.bos.v1.Hours"\x92\x01\n\x14\x46requencyConstraints\x12*\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x19.braiins.bos.v1.Frequency\x12&\n\x03min\x18\x02 \x01(\x0b\x32\x19.braiins.bos.v1.Frequency\x12&\n\x03max\x18\x03 \x01(\x0b\x32\x19.braiins.bos.v1.Frequency"\x8a\x01\n\x12VoltageConstraints\x12(\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x17.braiins.bos.v1.Voltage\x12$\n\x03min\x18\x02 \x01(\x0b\x32\x17.braiins.bos.v1.Voltage\x12$\n\x03max\x18\x03 \x01(\x0b\x32\x17.braiins.bos.v1.Voltageb\x06proto3'
|
||||
)
|
||||
|
||||
_globals = globals()
|
||||
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
|
||||
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.constraints_pb2", _globals)
|
||||
if _descriptor._USE_C_DESCRIPTORS == False:
|
||||
DESCRIPTOR._options = None
|
||||
_globals["_UINT32CONSTRAINTS"]._serialized_start = 64
|
||||
_globals["_UINT32CONSTRAINTS"]._serialized_end = 126
|
||||
_globals["_DOUBLECONSTRAINTS"]._serialized_start = 128
|
||||
_globals["_DOUBLECONSTRAINTS"]._serialized_end = 190
|
||||
_globals["_POWERCONSTRAINTS"]._serialized_start = 193
|
||||
_globals["_POWERCONSTRAINTS"]._serialized_end = 323
|
||||
_globals["_HASHRATECONSTRAINTS"]._serialized_start = 326
|
||||
_globals["_HASHRATECONSTRAINTS"]._serialized_end = 480
|
||||
_globals["_TEMPERATURECONSTRAINTS"]._serialized_start = 483
|
||||
_globals["_TEMPERATURECONSTRAINTS"]._serialized_end = 637
|
||||
_globals["_BOOLEANCONSTRAINT"]._serialized_start = 639
|
||||
_globals["_BOOLEANCONSTRAINT"]._serialized_end = 675
|
||||
_globals["_DURATIONCONSTRAINTS"]._serialized_start = 678
|
||||
_globals["_DURATIONCONSTRAINTS"]._serialized_end = 811
|
||||
_globals["_FREQUENCYCONSTRAINTS"]._serialized_start = 814
|
||||
_globals["_FREQUENCYCONSTRAINTS"]._serialized_end = 960
|
||||
_globals["_VOLTAGECONSTRAINTS"]._serialized_start = 963
|
||||
_globals["_VOLTAGECONSTRAINTS"]._serialized_end = 1101
|
||||
# @@protoc_insertion_point(module_scope)
|
||||
56
pyasic/web/bosminer/proto/bos/v1/cooling_pb2.py
Normal file
@@ -0,0 +1,56 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by the protocol buffer compiler. DO NOT EDIT!
|
||||
# source: bos/v1/cooling.proto
|
||||
"""Generated protocol buffer code."""
|
||||
from google.protobuf import descriptor as _descriptor
|
||||
from google.protobuf import descriptor_pool as _descriptor_pool
|
||||
from google.protobuf import symbol_database as _symbol_database
|
||||
from google.protobuf.internal import builder as _builder
|
||||
|
||||
# @@protoc_insertion_point(imports)
|
||||
|
||||
_sym_db = _symbol_database.Default()
|
||||
|
||||
|
||||
from ...bos.v1 import common_pb2 as bos_dot_v1_dot_common__pb2
|
||||
from ...bos.v1 import constraints_pb2 as bos_dot_v1_dot_constraints__pb2
|
||||
from ...bos.v1 import units_pb2 as bos_dot_v1_dot_units__pb2
|
||||
|
||||
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
|
||||
b'\n\x14\x62os/v1/cooling.proto\x12\x0e\x62raiins.bos.v1\x1a\x13\x62os/v1/common.proto\x1a\x18\x62os/v1/constraints.proto\x1a\x12\x62os/v1/units.proto"\xbc\x01\n\x0f\x43oolingAutoMode\x12\x37\n\x12target_temperature\x18\x01 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12\x34\n\x0fhot_temperature\x18\x02 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12:\n\x15\x64\x61ngerous_temperature\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature"\xb7\x01\n\x11\x43oolingManualMode\x12\x1c\n\x0f\x66\x61n_speed_ratio\x18\x01 \x01(\x01H\x00\x88\x01\x01\x12\x34\n\x0fhot_temperature\x18\x02 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12:\n\x15\x64\x61ngerous_temperature\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.TemperatureB\x12\n\x10_fan_speed_ratio"G\n\x13\x43oolingDisabledMode\x12\x1c\n\x0f\x66\x61n_speed_ratio\x18\x01 \x01(\x01H\x00\x88\x01\x01\x42\x12\n\x10_fan_speed_ratio"\xfb\x01\n\x14\x43oolingConfiguration\x12"\n\x15minimum_required_fans\x18\x01 \x01(\rH\x01\x88\x01\x01\x12/\n\x04\x61uto\x18\x02 \x01(\x0b\x32\x1f.braiins.bos.v1.CoolingAutoModeH\x00\x12\x33\n\x06manual\x18\x03 \x01(\x0b\x32!.braiins.bos.v1.CoolingManualModeH\x00\x12\x37\n\x08\x64isabled\x18\x04 \x01(\x0b\x32#.braiins.bos.v1.CoolingDisabledModeH\x00\x42\x06\n\x04modeB\x18\n\x16_minimum_required_fans"\x99\x03\n\x12\x43oolingConstraints\x12\x39\n\x14\x64\x65\x66\x61ult_cooling_mode\x18\x01 \x01(\x0e\x32\x1b.braiins.bos.v1.CoolingMode\x12\x42\n\x12target_temperature\x18\x02 \x01(\x0b\x32&.braiins.bos.v1.TemperatureConstraints\x12?\n\x0fhot_temperature\x18\x03 \x01(\x0b\x32&.braiins.bos.v1.TemperatureConstraints\x12\x45\n\x15\x64\x61ngerous_temperature\x18\x04 \x01(\x0b\x32&.braiins.bos.v1.TemperatureConstraints\x12:\n\x0f\x66\x61n_speed_ratio\x18\x05 \x01(\x0b\x32!.braiins.bos.v1.DoubleConstraints\x12@\n\x15minimum_required_fans\x18\x06 \x01(\x0b\x32!.braiins.bos.v1.UInt32Constraints"s\n\x08\x46\x61nState\x12\x15\n\x08position\x18\x01 \x01(\rH\x00\x88\x01\x01\x12\x0b\n\x03rpm\x18\x02 \x01(\r\x12\x1f\n\x12target_speed_ratio\x18\x03 \x01(\x01H\x01\x88\x01\x01\x42\x0b\n\t_positionB\x15\n\x13_target_speed_ratio"\x8f\x01\n\x11TemperatureSensor\x12\x0f\n\x02id\x18\x01 \x01(\rH\x00\x88\x01\x01\x12\x30\n\x08location\x18\x02 \x01(\x0e\x32\x1e.braiins.bos.v1.SensorLocation\x12\x30\n\x0btemperature\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.TemperatureB\x05\n\x03_id"\x18\n\x16GetCoolingStateRequest"\x81\x01\n\x17GetCoolingStateResponse\x12&\n\x04\x66\x61ns\x18\x01 \x03(\x0b\x32\x18.braiins.bos.v1.FanState\x12>\n\x13highest_temperature\x18\x02 \x01(\x0b\x32!.braiins.bos.v1.TemperatureSensor"i\n\x17SetImmersionModeRequest\x12/\n\x0bsave_action\x18\x01 \x01(\x0e\x32\x1a.braiins.bos.v1.SaveAction\x12\x1d\n\x15\x65nable_immersion_mode\x18\x02 \x01(\x08"2\n\x18SetImmersionModeResponse\x12\x16\n\x0eimmersion_mode\x18\x01 \x01(\x08*v\n\x0b\x43oolingMode\x12\x1c\n\x18\x43OOLING_MODE_UNSPECIFIED\x10\x00\x12\x15\n\x11\x43OOLING_MODE_AUTO\x10\x01\x12\x17\n\x13\x43OOLING_MODE_MANUAL\x10\x02\x12\x19\n\x15\x43OOLING_MODE_DISABLED\x10\x03*d\n\x0eSensorLocation\x12\x1f\n\x1bSENSOR_LOCATION_UNSPECIFIED\x10\x00\x12\x18\n\x14SENSOR_LOCATION_CHIP\x10\x01\x12\x17\n\x13SENSOR_LOCATION_PCB\x10\x02\x32\xdb\x01\n\x0e\x43oolingService\x12\x62\n\x0fGetCoolingState\x12&.braiins.bos.v1.GetCoolingStateRequest\x1a\'.braiins.bos.v1.GetCoolingStateResponse\x12\x65\n\x10SetImmersionMode\x12\'.braiins.bos.v1.SetImmersionModeRequest\x1a(.braiins.bos.v1.SetImmersionModeResponseb\x06proto3'
)

_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.cooling_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_COOLINGMODE"]._serialized_start = 1803
    _globals["_COOLINGMODE"]._serialized_end = 1921
    _globals["_SENSORLOCATION"]._serialized_start = 1923
    _globals["_SENSORLOCATION"]._serialized_end = 2023
    _globals["_COOLINGAUTOMODE"]._serialized_start = 108
    _globals["_COOLINGAUTOMODE"]._serialized_end = 296
    _globals["_COOLINGMANUALMODE"]._serialized_start = 299
    _globals["_COOLINGMANUALMODE"]._serialized_end = 482
    _globals["_COOLINGDISABLEDMODE"]._serialized_start = 484
    _globals["_COOLINGDISABLEDMODE"]._serialized_end = 555
    _globals["_COOLINGCONFIGURATION"]._serialized_start = 558
    _globals["_COOLINGCONFIGURATION"]._serialized_end = 809
    _globals["_COOLINGCONSTRAINTS"]._serialized_start = 812
    _globals["_COOLINGCONSTRAINTS"]._serialized_end = 1221
    _globals["_FANSTATE"]._serialized_start = 1223
    _globals["_FANSTATE"]._serialized_end = 1338
    _globals["_TEMPERATURESENSOR"]._serialized_start = 1341
    _globals["_TEMPERATURESENSOR"]._serialized_end = 1484
    _globals["_GETCOOLINGSTATEREQUEST"]._serialized_start = 1486
    _globals["_GETCOOLINGSTATEREQUEST"]._serialized_end = 1510
    _globals["_GETCOOLINGSTATERESPONSE"]._serialized_start = 1513
    _globals["_GETCOOLINGSTATERESPONSE"]._serialized_end = 1642
    _globals["_SETIMMERSIONMODEREQUEST"]._serialized_start = 1644
    _globals["_SETIMMERSIONMODEREQUEST"]._serialized_end = 1749
    _globals["_SETIMMERSIONMODERESPONSE"]._serialized_start = 1751
    _globals["_SETIMMERSIONMODERESPONSE"]._serialized_end = 1801
    _globals["_COOLINGSERVICE"]._serialized_start = 2026
    _globals["_COOLINGSERVICE"]._serialized_end = 2245
# @@protoc_insertion_point(module_scope)
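The generated cooling module can be exercised like any other protobuf bindings. A minimal sketch, not part of the diff, assuming pyasic is installed and the directory above maps straight onto the import path:

```python
# Hedged sketch: exercising the generated cooling messages locally.
from pyasic.web.bosminer.proto.bos.v1 import cooling_pb2

# SetImmersionModeRequest carries a SaveAction enum and a bool flag, per the
# serialized descriptor above; only the flag is set here.
req = cooling_pb2.SetImmersionModeRequest(enable_immersion_mode=True)
payload = req.SerializeToString()  # wire-format bytes for a gRPC call
print(cooling_pb2.SetImmersionModeRequest.FromString(payload))
```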
42  pyasic/web/bosminer/proto/bos/v1/license_pb2.py  Normal file
@@ -0,0 +1,42 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/license.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder

# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


from ...bos.v1 import units_pb2 as bos_dot_v1_dot_units__pb2

DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x14\x62os/v1/license.proto\x12\x0e\x62raiins.bos.v1\x1a\x12\x62os/v1/units.proto")\n\x0bNoneLicense\x12\x1a\n\x12time_to_restricted\x18\x01 \x01(\r"\x10\n\x0eLimitedLicense"\x9a\x01\n\x0cValidLicense\x12)\n\x04type\x18\x01 \x01(\x0e\x32\x1b.braiins.bos.v1.LicenseType\x12\x15\n\rcontract_name\x18\x02 \x01(\t\x12\x1a\n\x12time_to_restricted\x18\x03 \x01(\r\x12,\n\x07\x64\x65v_fee\x18\x04 \x01(\x0b\x32\x1b.braiins.bos.v1.BasesPoints"\x80\x01\n\x0e\x45xpiredLicense\x12)\n\x04type\x18\x01 \x01(\x0e\x32\x1b.braiins.bos.v1.LicenseType\x12\x15\n\rcontract_name\x18\x02 \x01(\t\x12,\n\x07\x64\x65v_fee\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.BasesPoints"\x18\n\x16GetLicenseStateRequest"\xe4\x01\n\x17GetLicenseStateResponse\x12+\n\x04none\x18\x01 \x01(\x0b\x32\x1b.braiins.bos.v1.NoneLicenseH\x00\x12\x31\n\x07limited\x18\x02 \x01(\x0b\x32\x1e.braiins.bos.v1.LimitedLicenseH\x00\x12-\n\x05valid\x18\x03 \x01(\x0b\x32\x1c.braiins.bos.v1.ValidLicenseH\x00\x12\x31\n\x07\x65xpired\x18\x04 \x01(\x0b\x32\x1e.braiins.bos.v1.ExpiredLicenseH\x00\x42\x07\n\x05state*_\n\x0bLicenseType\x12\x1c\n\x18LICENSE_TYPE_UNSPECIFIED\x10\x00\x12\x19\n\x15LICENSE_TYPE_STANDARD\x10\x01\x12\x17\n\x13LICENSE_TYPE_CUSTOM\x10\x02\x32t\n\x0eLicenseService\x12\x62\n\x0fGetLicenseState\x12&.braiins.bos.v1.GetLicenseStateRequest\x1a\'.braiins.bos.v1.GetLicenseStateResponseb\x06proto3'
)

_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.license_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_LICENSETYPE"]._serialized_start = 666
    _globals["_LICENSETYPE"]._serialized_end = 761
    _globals["_NONELICENSE"]._serialized_start = 60
    _globals["_NONELICENSE"]._serialized_end = 101
    _globals["_LIMITEDLICENSE"]._serialized_start = 103
    _globals["_LIMITEDLICENSE"]._serialized_end = 119
    _globals["_VALIDLICENSE"]._serialized_start = 122
    _globals["_VALIDLICENSE"]._serialized_end = 276
    _globals["_EXPIREDLICENSE"]._serialized_start = 279
    _globals["_EXPIREDLICENSE"]._serialized_end = 407
    _globals["_GETLICENSESTATEREQUEST"]._serialized_start = 409
    _globals["_GETLICENSESTATEREQUEST"]._serialized_end = 433
    _globals["_GETLICENSESTATERESPONSE"]._serialized_start = 436
    _globals["_GETLICENSESTATERESPONSE"]._serialized_end = 664
    _globals["_LICENSESERVICE"]._serialized_start = 763
    _globals["_LICENSESERVICE"]._serialized_end = 879
# @@protoc_insertion_point(module_scope)
84  pyasic/web/bosminer/proto/bos/v1/miner_pb2.py  Normal file
File diff suppressed because one or more lines are too long
110  pyasic/web/bosminer/proto/bos/v1/performance.py  Normal file
@@ -0,0 +1,110 @@
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from dataclasses import asdict, dataclass
from typing import Union


@dataclass
class Frequency:
    hertz: float


@dataclass
class Voltage:
    volt: float


@dataclass
class Power:
    watt: int


@dataclass
class TeraHashrate:
    terahash_per_second: float


@dataclass
class HashboardPerformanceSettings:
    id: str
    frequency: Frequency
    voltage: Voltage


@dataclass
class ManualPerformanceMode:
    global_frequency: Frequency
    global_voltage: Voltage
    hashboards: list[HashboardPerformanceSettings]


@dataclass
class PowerTargetMode:
    power_target: Power


@dataclass
class HashrateTargetMode:
    hashrate_target: TeraHashrate


@dataclass
class TunerPerformanceMode:
    target: Union[PowerTargetMode, HashrateTargetMode]


@dataclass
class PerformanceMode:
    mode: Union[ManualPerformanceMode, TunerPerformanceMode]

    @classmethod
    def create(
        cls,
        power_target: int = None,
        hashrate_target: float = None,
        manual_configuration: ManualPerformanceMode = None,
    ):
        provided_args = [power_target, hashrate_target, manual_configuration]
        if sum(arg is not None for arg in provided_args) > 1:
            raise ValueError(
                "More than one keyword argument provided. Please use only power target, hashrate target, or manual config."
            )
        elif sum(arg is not None for arg in provided_args) < 1:
            raise ValueError(
                "Please pass one of power target, hashrate target, or manual config."
            )

        if power_target is not None:
            return cls(
                mode=TunerPerformanceMode(
                    target=PowerTargetMode(power_target=Power(watt=power_target))
                )
            )
        elif hashrate_target is not None:
            return cls(
                mode=TunerPerformanceMode(
                    target=HashrateTargetMode(
                        hashrate_target=TeraHashrate(
                            terahash_per_second=hashrate_target
                        )
                    )
                )
            )
        elif manual_configuration is not None:
            return cls(mode=manual_configuration)

    def as_dict(self):
        return asdict(self)
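For context, a short usage sketch of the helper classes above; it is not part of the diff, and it assumes the file location shown here maps to the import path `pyasic.web.bosminer.proto.bos.v1.performance`. The values are illustrative only.

```python
# Hedged usage sketch for the dataclasses defined above.
from pyasic.web.bosminer.proto.bos.v1.performance import (
    Frequency,
    HashboardPerformanceSettings,
    ManualPerformanceMode,
    PerformanceMode,
    Voltage,
)

# Exactly one of the three keyword arguments may be supplied; zero or more
# than one raises ValueError.
power_mode = PerformanceMode.create(power_target=3000)         # watts
hashrate_mode = PerformanceMode.create(hashrate_target=100.0)  # TH/s
manual_mode = PerformanceMode.create(
    manual_configuration=ManualPerformanceMode(
        global_frequency=Frequency(hertz=525.0),
        global_voltage=Voltage(volt=13.4),
        hashboards=[
            HashboardPerformanceSettings(
                id="0", frequency=Frequency(hertz=525.0), voltage=Voltage(volt=13.4)
            )
        ],
    )
)

print(power_mode.as_dict())
# {'mode': {'target': {'power_target': {'watt': 3000}}}}
```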
112  pyasic/web/bosminer/proto/bos/v1/performance_pb2.py  Normal file
File diff suppressed because one or more lines are too long
75  pyasic/web/bosminer/proto/bos/v1/pool_pb2.py  Normal file
@@ -0,0 +1,75 @@
# -*- coding: utf-8 -*-

# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------

# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/pool.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder

# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


from ...bos.v1 import common_pb2 as bos_dot_v1_dot_common__pb2

DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x11\x62os/v1/pool.proto\x12\x0e\x62raiins.bos.v1\x1a\x13\x62os/v1/common.proto"\x16\n\x05Quota\x12\r\n\x05value\x18\x01 \x01(\r" \n\x0f\x46ixedShareRatio\x12\r\n\x05value\x18\x01 \x01(\x01"\xe4\x01\n\x16PoolGroupConfiguration\x12\x0b\n\x03uid\x18\x01 \x01(\t\x12\x0c\n\x04name\x18\x02 \x01(\t\x12&\n\x05quota\x18\x03 \x01(\x0b\x32\x15.braiins.bos.v1.QuotaH\x00\x12<\n\x11\x66ixed_share_ratio\x18\x04 \x01(\x0b\x32\x1f.braiins.bos.v1.FixedShareRatioH\x00\x12\x30\n\x05pools\x18\x05 \x03(\x0b\x32!.braiins.bos.v1.PoolConfigurationB\x17\n\x15load_balance_strategy"\x81\x01\n\x11PoolConfiguration\x12\x0b\n\x03uid\x18\x01 \x01(\t\x12\x0b\n\x03url\x18\x02 \x01(\t\x12\x0c\n\x04user\x18\x03 \x01(\t\x12\x15\n\x08password\x18\x04 \x01(\tH\x00\x88\x01\x01\x12\x14\n\x07\x65nabled\x18\x05 \x01(\x08H\x01\x88\x01\x01\x42\x0b\n\t_passwordB\n\n\x08_enabled"\xb0\x01\n\tPoolGroup\x12\x0c\n\x04name\x18\x01 \x01(\t\x12&\n\x05quota\x18\x02 \x01(\x0b\x32\x15.braiins.bos.v1.QuotaH\x00\x12<\n\x11\x66ixed_share_ratio\x18\x03 \x01(\x0b\x32\x1f.braiins.bos.v1.FixedShareRatioH\x00\x12#\n\x05pools\x18\x04 \x03(\x0b\x32\x14.braiins.bos.v1.PoolB\n\n\x08strategy"\x88\x01\n\x04Pool\x12\x0b\n\x03uid\x18\x01 \x01(\t\x12\x0b\n\x03url\x18\x02 \x01(\t\x12\x0c\n\x04user\x18\x03 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x04 \x01(\x08\x12\r\n\x05\x61live\x18\x05 \x01(\x08\x12\x0e\n\x06\x61\x63tive\x18\x06 \x01(\x08\x12(\n\x05stats\x18\x07 \x01(\x0b\x32\x19.braiins.bos.v1.PoolStats"\x98\x01\n\tPoolStats\x12\x17\n\x0f\x61\x63\x63\x65pted_shares\x18\x01 \x01(\x04\x12\x17\n\x0frejected_shares\x18\x02 \x01(\x04\x12\x14\n\x0cstale_shares\x18\x03 \x01(\x04\x12\x17\n\x0flast_difficulty\x18\x04 \x01(\x04\x12\x12\n\nbest_share\x18\x05 \x01(\x04\x12\x16\n\x0egenerated_work\x18\x06 \x01(\x04"\x16\n\x14GetPoolGroupsRequest"G\n\x15GetPoolGroupsResponse\x12.\n\x0bpool_groups\x18\x01 \x03(\x0b\x32\x19.braiins.bos.v1.PoolGroup"\x80\x01\n\x16\x43reatePoolGroupRequest\x12/\n\x0bsave_action\x18\x01 \x01(\x0e\x32\x1a.braiins.bos.v1.SaveAction\x12\x35\n\x05group\x18\x02 \x01(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration"P\n\x17\x43reatePoolGroupResponse\x12\x35\n\x05group\x18\x01 \x01(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration"\x80\x01\n\x16UpdatePoolGroupRequest\x12/\n\x0bsave_action\x18\x01 \x01(\x0e\x32\x1a.braiins.bos.v1.SaveAction\x12\x35\n\x05group\x18\x02 \x01(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration"P\n\x17UpdatePoolGroupResponse\x12\x35\n\x05group\x18\x01 \x01(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration"V\n\x16RemovePoolGroupRequest\x12/\n\x0bsave_action\x18\x01 \x01(\x0e\x32\x1a.braiins.bos.v1.SaveAction\x12\x0b\n\x03uid\x18\x02 \x01(\t"\x19\n\x17RemovePoolGroupResponse2\x97\x03\n\x0bPoolService\x12\\\n\rGetPoolGroups\x12$.braiins.bos.v1.GetPoolGroupsRequest\x1a%.braiins.bos.v1.GetPoolGroupsResponse\x12\x62\n\x0f\x43reatePoolGroup\x12&.braiins.bos.v1.CreatePoolGroupRequest\x1a\'.braiins.bos.v1.CreatePoolGroupResponse\x12\x62\n\x0fUpdatePoolGroup\x12&.braiins.bos.v1.UpdatePoolGroupRequest\x1a\'.braiins.bos.v1.UpdatePoolGroupResponse\x12\x62\n\x0fRemovePoolGroup\x12&.braiins.bos.v1.RemovePoolGroupRequest\x1a\'.braiins.bos.v1.RemovePoolGroupResponseb\x06proto3'
)

_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.pool_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_QUOTA"]._serialized_start = 58
    _globals["_QUOTA"]._serialized_end = 80
    _globals["_FIXEDSHARERATIO"]._serialized_start = 82
    _globals["_FIXEDSHARERATIO"]._serialized_end = 114
    _globals["_POOLGROUPCONFIGURATION"]._serialized_start = 117
    _globals["_POOLGROUPCONFIGURATION"]._serialized_end = 345
    _globals["_POOLCONFIGURATION"]._serialized_start = 348
    _globals["_POOLCONFIGURATION"]._serialized_end = 477
    _globals["_POOLGROUP"]._serialized_start = 480
    _globals["_POOLGROUP"]._serialized_end = 656
    _globals["_POOL"]._serialized_start = 659
    _globals["_POOL"]._serialized_end = 795
    _globals["_POOLSTATS"]._serialized_start = 798
    _globals["_POOLSTATS"]._serialized_end = 950
    _globals["_GETPOOLGROUPSREQUEST"]._serialized_start = 952
    _globals["_GETPOOLGROUPSREQUEST"]._serialized_end = 974
    _globals["_GETPOOLGROUPSRESPONSE"]._serialized_start = 976
    _globals["_GETPOOLGROUPSRESPONSE"]._serialized_end = 1047
    _globals["_CREATEPOOLGROUPREQUEST"]._serialized_start = 1050
    _globals["_CREATEPOOLGROUPREQUEST"]._serialized_end = 1178
    _globals["_CREATEPOOLGROUPRESPONSE"]._serialized_start = 1180
    _globals["_CREATEPOOLGROUPRESPONSE"]._serialized_end = 1260
    _globals["_UPDATEPOOLGROUPREQUEST"]._serialized_start = 1263
    _globals["_UPDATEPOOLGROUPREQUEST"]._serialized_end = 1391
    _globals["_UPDATEPOOLGROUPRESPONSE"]._serialized_start = 1393
    _globals["_UPDATEPOOLGROUPRESPONSE"]._serialized_end = 1473
    _globals["_REMOVEPOOLGROUPREQUEST"]._serialized_start = 1475
    _globals["_REMOVEPOOLGROUPREQUEST"]._serialized_end = 1561
    _globals["_REMOVEPOOLGROUPRESPONSE"]._serialized_start = 1563
    _globals["_REMOVEPOOLGROUPRESPONSE"]._serialized_end = 1588
    _globals["_POOLSERVICE"]._serialized_start = 1591
    _globals["_POOLSERVICE"]._serialized_end = 1998
# @@protoc_insertion_point(module_scope)
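As with the cooling module, pool messages can be composed directly. A hedged sketch using field names taken from the serialized descriptor above; the identifiers and URL are placeholders, and the sketch is not part of the diff:

```python
# Hedged sketch: composing a pool group from the generated pool messages.
from pyasic.web.bosminer.proto.bos.v1 import pool_pb2

group = pool_pb2.PoolGroupConfiguration(
    uid="group-0",                  # hypothetical identifier
    name="Default",
    quota=pool_pb2.Quota(value=1),  # 'quota' and 'fixed_share_ratio' share a oneof
    pools=[
        pool_pb2.PoolConfiguration(
            uid="pool-0",
            url="stratum+tcp://stratum.example.com:3333",
            user="worker.1",
            enabled=True,
        )
    ],
)
print(group.WhichOneof("load_balance_strategy"))  # -> "quota"
```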
61  pyasic/web/bosminer/proto/bos/v1/units_pb2.py  Normal file
@@ -0,0 +1,61 @@
# -*- coding: utf-8 -*-

# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------

# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/units.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder

# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x12\x62os/v1/units.proto\x12\x0e\x62raiins.bos.v1"+\n\x0cMegaHashrate\x12\x1b\n\x13megahash_per_second\x18\x01 \x01(\x01"+\n\x0cGigaHashrate\x12\x1b\n\x13gigahash_per_second\x18\x01 \x01(\x01"+\n\x0cTeraHashrate\x12\x1b\n\x13terahash_per_second\x18\x01 \x01(\x01"\x1a\n\tFrequency\x12\r\n\x05hertz\x18\x01 \x01(\x01"\x17\n\x07Voltage\x12\x0c\n\x04volt\x18\x01 \x01(\x01"\x15\n\x05Power\x12\x0c\n\x04watt\x18\x01 \x01(\x04"-\n\x0fPowerEfficiency\x12\x1a\n\x12joule_per_terahash\x18\x01 \x01(\x01"\x1f\n\x0bTemperature\x12\x10\n\x08\x64\x65gree_c\x18\x01 \x01(\x01"\x1a\n\x0b\x42\x61sesPoints\x12\x0b\n\x03\x62sp\x18\x01 \x01(\r"\x16\n\x05Hours\x12\r\n\x05hours\x18\x01 \x01(\rb\x06proto3'
)

_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.units_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_MEGAHASHRATE"]._serialized_start = 38
    _globals["_MEGAHASHRATE"]._serialized_end = 81
    _globals["_GIGAHASHRATE"]._serialized_start = 83
    _globals["_GIGAHASHRATE"]._serialized_end = 126
    _globals["_TERAHASHRATE"]._serialized_start = 128
    _globals["_TERAHASHRATE"]._serialized_end = 171
    _globals["_FREQUENCY"]._serialized_start = 173
    _globals["_FREQUENCY"]._serialized_end = 199
    _globals["_VOLTAGE"]._serialized_start = 201
    _globals["_VOLTAGE"]._serialized_end = 224
    _globals["_POWER"]._serialized_start = 226
    _globals["_POWER"]._serialized_end = 247
    _globals["_POWEREFFICIENCY"]._serialized_start = 249
    _globals["_POWEREFFICIENCY"]._serialized_end = 294
    _globals["_TEMPERATURE"]._serialized_start = 296
    _globals["_TEMPERATURE"]._serialized_end = 327
    _globals["_BASESPOINTS"]._serialized_start = 329
    _globals["_BASESPOINTS"]._serialized_end = 355
    _globals["_HOURS"]._serialized_start = 357
    _globals["_HOURS"]._serialized_end = 379
# @@protoc_insertion_point(module_scope)
47  pyasic/web/bosminer/proto/bos/v1/work_pb2.py  Normal file
@@ -0,0 +1,47 @@
# -*- coding: utf-8 -*-

# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------

# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/work.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder

# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


from ...bos.v1 import units_pb2 as bos_dot_v1_dot_units__pb2

DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x11\x62os/v1/work.proto\x12\x0e\x62raiins.bos.v1\x1a\x12\x62os/v1/units.proto"\xef\x03\n\x0cRealHashrate\x12-\n\x07last_5s\x18\x01 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_15s\x18\x02 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_30s\x18\x03 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12-\n\x07last_1m\x18\x04 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12-\n\x07last_5m\x18\x05 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_15m\x18\x06 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_30m\x18\x07 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12-\n\x07last_1h\x18\x08 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_24h\x18\t \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12\x33\n\rsince_restart\x18\n \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate"\xde\x01\n\x0fWorkSolverStats\x12\x33\n\rreal_hashrate\x18\x01 \x01(\x0b\x32\x1c.braiins.bos.v1.RealHashrate\x12\x36\n\x10nominal_hashrate\x18\x02 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12\x34\n\x0e\x65rror_hashrate\x18\x03 \x01(\x0b\x32\x1c.braiins.bos.v1.MegaHashrate\x12\x14\n\x0c\x66ound_blocks\x18\x04 \x01(\r\x12\x12\n\nbest_share\x18\x05 \x01(\x04\x62\x06proto3'
)

_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.work_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_REALHASHRATE"]._serialized_start = 58
    _globals["_REALHASHRATE"]._serialized_end = 553
    _globals["_WORKSOLVERSTATS"]._serialized_start = 556
    _globals["_WORKSOLVERSTATS"]._serialized_end = 778
# @@protoc_insertion_point(module_scope)
47  pyasic/web/bosminer/proto/bos/version_pb2.py  Normal file
@@ -0,0 +1,47 @@
# -*- coding: utf-8 -*-

# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------

# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/version.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder

# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x11\x62os/version.proto\x12\x0b\x62raiins.bos"U\n\nApiVersion\x12\r\n\x05major\x18\x01 \x01(\x04\x12\r\n\x05minor\x18\x02 \x01(\x04\x12\r\n\x05patch\x18\x03 \x01(\x04\x12\x0b\n\x03pre\x18\x04 \x01(\t\x12\r\n\x05\x62uild\x18\x05 \x01(\t"\x13\n\x11\x41piVersionRequest2]\n\x11\x41piVersionService\x12H\n\rGetApiVersion\x12\x1e.braiins.bos.ApiVersionRequest\x1a\x17.braiins.bos.ApiVersionb\x06proto3'
)

_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.version_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_APIVERSION"]._serialized_start = 34
    _globals["_APIVERSION"]._serialized_end = 119
    _globals["_APIVERSIONREQUEST"]._serialized_start = 121
    _globals["_APIVERSIONREQUEST"]._serialized_end = 140
    _globals["_APIVERSIONSERVICE"]._serialized_start = 142
    _globals["_APIVERSIONSERVICE"]._serialized_end = 235
# @@protoc_insertion_point(module_scope)
@@ -19,7 +19,7 @@ from typing import Union

import httpx

-from pyasic.settings import PyasicSettings
+from pyasic import settings
from pyasic.web import BaseWebAPI


@@ -27,7 +27,7 @@ class GoldshellWebAPI(BaseWebAPI):
    def __init__(self, ip: str) -> None:
        super().__init__(ip)
        self.username = "admin"
-        self.pwd = PyasicSettings().global_goldshell_password
+        self.pwd = settings.get("default_goldshell_password", "123456789")
        self.jwt = None

    async def auth(self):
@@ -72,7 +72,7 @@ class GoldshellWebAPI(BaseWebAPI):
        if not self.jwt:
            await self.auth()
        async with httpx.AsyncClient() as client:
-            for i in range(PyasicSettings().miner_get_data_retries):
+            for i in range(settings.get("get_data_retries", 1)):
                try:
                    if parameters:
                        response = await client.put(
@@ -19,8 +19,8 @@ from typing import Union

import httpx

+from pyasic import settings
from pyasic.errors import APIError
-from pyasic.settings import PyasicSettings
from pyasic.web import BaseWebAPI


@@ -28,7 +28,7 @@ class InnosiliconWebAPI(BaseWebAPI):
    def __init__(self, ip: str) -> None:
        super().__init__(ip)
        self.username = "admin"
-        self.pwd = PyasicSettings().global_innosilicon_password
+        self.pwd = settings.get("default_innosilicon_password", "admin")
        self.jwt = None

    async def auth(self):
@@ -55,7 +55,7 @@ class InnosiliconWebAPI(BaseWebAPI):
        if not self.jwt:
            await self.auth()
        async with httpx.AsyncClient() as client:
-            for i in range(PyasicSettings().miner_get_data_retries):
+            for i in range(settings.get("get_data_retries", 1)):
                try:
                    response = await client.post(
                        f"http://{self.ip}/api/{command}",
@@ -19,7 +19,7 @@ from typing import Union

import httpx

-from pyasic.settings import PyasicSettings
+from pyasic import settings
from pyasic.web import BaseWebAPI


@@ -27,7 +27,7 @@ class VNishWebAPI(BaseWebAPI):
    def __init__(self, ip: str) -> None:
        super().__init__(ip)
        self.username = "admin"
-        self.pwd = PyasicSettings().global_vnish_password
+        self.pwd = settings.get("default_vnish_password", "admin")
        self.token = None

    async def auth(self):
@@ -59,7 +59,7 @@ class VNishWebAPI(BaseWebAPI):
        if not self.token:
            await self.auth()
        async with httpx.AsyncClient() as client:
-            for i in range(PyasicSettings().miner_get_data_retries):
+            for i in range(settings.get("get_data_retries", 1)):
                try:
                    auth = self.token
                    if command.startswith("system"):
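All three web backends above move from the `PyasicSettings` dataclass to the module-level `pyasic.settings` store. A minimal sketch of reading the same keys the new lines pass to `settings.get(key, default)`; the sketch is not part of the diff:

```python
# Hedged sketch: the keys and defaults below are taken from the hunks above.
from pyasic import settings

print(settings.get("default_goldshell_password", "123456789"))
print(settings.get("get_data_retries", 1))
```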
@@ -1,7 +1,7 @@
[tool.poetry]
name = "pyasic"
-version = "0.37.6"
-description = "A set of modules for interfacing with many common types of ASIC bitcoin miners, using both their API and SSH."
+version = "0.40.5"
+description = "A simplified and standardized interface for Bitcoin ASICs."
authors = ["UpstreamData <brett@upstreamdata.ca>"]
repository = "https://github.com/UpstreamData/pyasic"
documentation = "https://pyasic.readthedocs.io/en/latest/"
@@ -9,10 +9,11 @@ readme = "README.md"

[tool.poetry.dependencies]
python = "^3.8"
-asyncssh = "^2.13.1"
-httpx = "^0.24.0"
+httpx = "^0.25.0"
+asyncssh = "^2.14.1"
+grpc-requests = "^0.1.11"
passlib = "^1.7.4"
-pyaml = "^23.5.9"
+pyaml = "^23.9.7"
toml = "^0.10.2"

[tool.poetry.group.dev]