Compare commits


43 Commits

Author SHA1 Message Date
Upstream Data
7a75818a20 version: bump version number. 2023-12-17 09:09:00 -07:00
Upstream Data
d2be68d35e bug: fix MinerConfig default values for 3.11+. Add MinerConfig.as_epic default implementation. 2023-12-17 09:08:14 -07:00
Upstream Data
c5c4bb10ee version: bump version number. 2023-12-16 10:59:23 -07:00
Upstream Data
c4dfdda448 Merge branch 'dev_bugs'
# Conflicts:
#	pyasic/miners/miner_factory.py
#	pyasic/miners/types/whatsminer/M6X/M60.py
#	pyasic/miners/types/whatsminer/M6X/M60S.py
#	pyasic/miners/types/whatsminer/M6X/M63.py
#	pyasic/miners/types/whatsminer/M6X/M63S.py
#	pyasic/miners/types/whatsminer/M6X/M66.py
#	pyasic/miners/types/whatsminer/M6X/M66S.py
#	pyasic/miners/types/whatsminer/M6X/__init__.py
#	pyasic/miners/whatsminer/btminer/M6X/M60.py
#	pyasic/miners/whatsminer/btminer/M6X/M60S.py
#	pyasic/miners/whatsminer/btminer/M6X/M66S.py
#	pyasic/miners/whatsminer/btminer/M6X/__init__.py
#	pyasic/miners/whatsminer/btminer/__init__.py
2023-12-16 10:55:27 -07:00
Upstream Data
4459de2260 feature: add support for S19kProNoPIC BOS. Reformat. 2023-12-16 10:54:51 -07:00
UpstreamData
201cfd7ef9 docs: update documentation to be more readable on the main page. 2023-12-13 11:15:03 -07:00
UpstreamData
4201905fdd bug: fix some tasks not being cancelled properly in miner factory. 2023-12-13 10:18:28 -07:00
checksum0
497ffb5bc0 Add all the currently known Whatsminer M6X machines (#77)
* Create new BTMiner M6X backend class to represent Whatsminer new M6X generation

* Add all new known types of Whatsminer M6X

* Ensure all new types are imported in their respective __init__.py

* Create all BTMiner API class for known types of new M6X generation

* Ensure all new BTMiner API class are imported in __init__.py

* Fix erroneous M6X models data

* Ensure M6X miners are imported and add them to their MinerTypes dictionary in miner_factory.py
2023-12-12 19:38:36 -07:00
checksum0
2f762c95db Add all the currently known Whatsminer M6X machines (#77)
* Create new BTMiner M6X backend class to represent Whatsminer new M6X generation

* Add all new known types of Whatsminer M6X

* Ensure all new types are imported in their respective __init__.py

* Create all BTMiner API class for known types of new M6X generation

* Ensure all new BTMiner API class are imported in __init__.py

* Fix erroneous M6X models data

* Ensure M6X miners are imported and add them to their MinerTypes dictionary in miner_factory.py
2023-12-12 19:32:12 -07:00
UpstreamData
67aed79330 bug: fix mode spec in bosminer config. 2023-12-12 13:21:50 -07:00
UpstreamData
073e048726 bug: fix bosminer config missing format information. 2023-12-12 13:11:49 -07:00
UpstreamData
02234f3d1e feature: improve dict merging speed 2023-12-12 09:25:43 -07:00
UpstreamData
dc22df0280 refactor: remove innosilicon pool comment, as it is correct. 2023-12-12 08:54:24 -07:00
UpstreamData
02056b8c88 refactor: remove config prints. 2023-12-11 15:36:02 -07:00
UpstreamData
3a43cd293c bug: Fix improper naming of fan mode. 2023-12-11 15:18:23 -07:00
UpstreamData
6941d9f349 bug: add default case for work mode when there is no work mode returned from bitmain. 2023-12-11 15:08:57 -07:00
UpstreamData
f6b0b64d86 bug: set default quota to 1. 2023-12-11 14:07:17 -07:00
UpstreamData
8d68dd9dac refactor: re-order config keys 2023-12-11 14:06:22 -07:00
UpstreamData
27368a9bd2 bug: fix some issues, and remove unused imports. 2023-12-11 13:48:26 -07:00
UpstreamData
c919b00312 feature: allow config conversion to and from dict. 2023-12-11 13:40:10 -07:00
UpstreamData
f162529883 feature: allow dps conversion for bos grpc. 2023-12-11 11:40:46 -07:00
Upstream Data
bb182bb22d bug: fix some issues with return types and missing return statements. 2023-12-10 20:28:06 -07:00
Upstream Data
af15c4fbd1 bug: pin working betterproto version. 2023-12-10 20:25:27 -07:00
Upstream Data
47c2eb9f0e feature: use betterproto + grpclib. 2023-12-10 20:10:11 -07:00
Upstream Data
1ab39f5873 bug: fix bosminer config parsing. 2023-12-10 17:40:39 -07:00
Upstream Data
43200a7354 feature: Add bosminer.toml parser. 2023-12-10 13:20:03 -07:00
Upstream Data
4fc57832e1 feature: Finish fixing get and send config handlers for miners. 2023-12-10 10:14:57 -07:00
Upstream Data
9ee63cc3ab feature: Update get and send config methods for most miners, and add as_inno. 2023-12-10 10:10:55 -07:00
Upstream Data
b22b506d55 feature: Add whatsminer send_config. 2023-12-10 09:55:05 -07:00
Upstream Data
468fba3465 feature: Add whatsminer set mode commands. 2023-12-10 09:49:24 -07:00
Upstream Data
0399094197 feature: add AM old and goldshell configs. 2023-12-10 09:45:34 -07:00
Upstream Data
bfdfa8a6ab feature: Add AM modern send and get config. 2023-12-10 09:30:31 -07:00
Upstream Data
83d0d09b0d feature: Add whatsminer get_config. 2023-12-09 17:35:47 -07:00
Upstream Data
f892c3a0fd feature: Add from am_modern to config. 2023-12-09 16:59:39 -07:00
Upstream Data
81b974f565 bug: fix bad indentation. 2023-12-09 15:12:36 -07:00
UpstreamData
5eaf876c6d feature: add bos to config miner types. 2023-12-09 13:27:23 -07:00
Upstream Data
d7d1b845a7 feature: add MinerConfig.from_api(). 2023-12-09 13:06:52 -07:00
UpstreamData
242517a36a feature: add inno to config miner types. 2023-12-08 11:03:36 -07:00
UpstreamData
791249bf3d feature: add avalon and goldshell to miner config types. 2023-12-08 10:57:57 -07:00
UpstreamData
5a70a27f07 reformat: remove some useless files. 2023-12-08 10:11:43 -07:00
UpstreamData
bca81f3bca feature: add AM old and modern, and WM config implementation. 2023-12-08 10:10:21 -07:00
UpstreamData
6d75565baf feature: start adding new config implementation. 2023-12-08 09:16:04 -07:00
JP Compagnone
9f42e6a3be add new Antminer models (S19jPro+ and S19k Pro) (#75)
* Add S19jPro+ and S19K Pro

* typo
2023-12-08 08:34:30 -07:00
76 changed files with 5516 additions and 2088 deletions


@@ -6,19 +6,3 @@
options:
show_root_heading: false
heading_level: 4
## Pool Groups
::: pyasic.config._PoolGroup
handler: python
options:
show_root_heading: false
heading_level: 4
## Pools
::: pyasic.config._Pool
handler: python
options:
show_root_heading: false
heading_level: 4


@@ -8,17 +8,20 @@
[![GitHub](https://img.shields.io/github/license/UpstreamData/pyasic)](https://github.com/UpstreamData/pyasic/blob/master/LICENSE.txt)
[![CodeFactor Grade](https://img.shields.io/codefactor/grade/github/UpstreamData/pyasic)](https://www.codefactor.io/repository/github/upstreamdata/pyasic)
---
## Intro
Welcome to pyasic! Pyasic uses an asynchronous method of communicating with asic miners on your network, which makes it super fast.
---
Welcome to `pyasic`! `pyasic` uses an asynchronous method of communicating with ASIC miners on your network, which makes it super fast.
[Supported Miner Types](miners/supported_types.md)
[Click here to view supported miner types](miners/supported_types.md)
Getting started with pyasic is easy. First, find your miner (or miners) on the network by scanning for them or getting the correct class automatically for them if you know the IP.
---
## Getting started
---
Getting started with `pyasic` is easy. First, find your miner (or miners) on the network by scanning for them or getting the correct class automatically for them if you know the IP.
<br>
## Scanning for miners
To scan for miners in pyasic, we use the class [`MinerNetwork`][pyasic.network.MinerNetwork], which abstracts the search, communication, identification, setup, and return of a miner to 1 command.
##### Scanning for miners
To scan for miners in `pyasic`, we use the class [`MinerNetwork`][pyasic.network.MinerNetwork], which abstracts the search, communication, identification, setup, and return of a miner to 1 command.
The command [`MinerNetwork.scan()`][pyasic.network.MinerNetwork.scan] returns a list that contains any miners found.
```python
import asyncio  # asyncio for handling the async part
@@ -32,16 +35,15 @@ async def scan_miners(): # define async scan function to allow awaiting
    # scan for miners asynchronously
    # this will return the correct type of miners if they are supported with all functionality.
    miners = await network.scan_network_for_miners()
    miners = await network.scan()
    print(miners)


if __name__ == "__main__":
    asyncio.run(scan_miners())  # run the scan asynchronously with asyncio.run()
```
<br>
## Creating miners based on IP
---
##### Creating miners based on IP
If you already know the IP address of your miner or miners, you can use the [`MinerFactory`][pyasic.miners.miner_factory.MinerFactory] to communicate and identify the miners, or an abstraction of its functionality, [`get_miner()`][pyasic.miners.get_miner].
The function [`get_miner()`][pyasic.miners.get_miner] will return any miner it found at the IP address specified, or an `UnknownMiner` if it cannot identify the miner.
```python
@@ -58,6 +60,8 @@ async def get_miners(): # define async scan function to allow awaiting
    print(miner_1, miner_2)

    # can also gather these, since they are async
    # gathering them will get them both at the same time
    # this makes it much faster to get a lot of miners at a time
    tasks = [get_miner("192.168.1.75"), get_miner("192.168.1.76")]
    miners = await asyncio.gather(*tasks)
    print(miners)
@@ -67,13 +71,14 @@ if __name__ == "__main__":
    asyncio.run(get_miners())  # get the miners asynchronously with asyncio.run()
```
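The speed-up from gathering comes from the requests overlapping in time instead of running back to back. A minimal sketch of the same pattern, independent of `pyasic` (the `fetch` coroutine here is a hypothetical stand-in for a network call such as `get_miner()`):

```python
import asyncio
import time


async def fetch(ip: str) -> str:
    # stand-in for a network request such as get_miner(ip)
    await asyncio.sleep(0.1)
    return f"miner@{ip}"


async def main() -> list:
    start = time.monotonic()
    # both "requests" run concurrently, so total time is roughly one
    # sleep (~0.1s), not the sum of both
    results = await asyncio.gather(fetch("192.168.1.75"), fetch("192.168.1.76"))
    print(results, f"{time.monotonic() - start:.2f}s")
    return results


if __name__ == "__main__":
    asyncio.run(main())
```

The same idea scales to hundreds of miners, which is why the scan and data-gathering examples above build a list of coroutines and gather them.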
<br>
## Getting data from miners
Once you have your miner(s) identified, you will likely want to get data from the miner(s). You can do this using a built in function in each miner called `get_data()`.
---
## Data gathering
---
Once you have your miner(s) identified, you will likely want to get data from the miner(s). You can do this using a built-in function in each miner called `get_data()`.
This function will return an instance of the dataclass [`MinerData`][pyasic.data.MinerData] with all data it can gather from the miner.
Each piece of data in a [`MinerData`][pyasic.data.MinerData] instance can be referenced by getting it as an attribute, such as [`MinerData().hashrate`][pyasic.data.MinerData].
##### One miner
```python
import asyncio
from pyasic import get_miner
@@ -88,7 +93,8 @@ async def gather_miner_data():
if __name__ == "__main__":
    asyncio.run(gather_miner_data())
```
---
##### Multiple miners
You can do something similar with multiple miners, with only needing to make a small change to get all the data at once.
```python
import asyncio  # asyncio for handling the async part
@@ -96,8 +102,8 @@ from pyasic.network import MinerNetwork # miner network handles the scanning


async def gather_miner_data():  # define async scan function to allow awaiting
    network = MinerNetwork("192.168.1.50")
    miners = await network.scan_network_for_miners()
    network = MinerNetwork.from_subnet("192.168.1.50/24")
    miners = await network.scan()

    # we need to asyncio.gather() all the miners get_data() functions to make them run together
    all_miner_data = await asyncio.gather(*[miner.get_data() for miner in miners])
@@ -109,157 +115,56 @@ if __name__ == "__main__":
    asyncio.run(gather_miner_data())
```
<br>
## Controlling miners via pyasic
Every miner class in pyasic must implement all the control functions defined in [`BaseMiner`][pyasic.miners.BaseMiner].
---
## Miner control
---
`pyasic` exposes a standard interface for each miner using control functions.
Every miner class in `pyasic` must implement all the control functions defined in [`BaseMiner`][pyasic.miners.BaseMiner].
These functions are
[`check_light`](#check-light),
[`fault_light_off`](#fault-light-off),
[`fault_light_on`](#fault-light-on),
[`get_config`](#get-config),
[`get_data`](#get-data),
[`get_errors`](#get-errors),
[`get_hostname`](#get-hostname),
[`get_model`](#get-model),
[`reboot`](#reboot),
[`restart_backend`](#restart-backend),
[`stop_mining`](#stop-mining),
[`resume_mining`](#resume-mining),
[`is_mining`](#is-mining),
[`send_config`](#send-config), and
[`set_power_limit`](#set-power-limit).
[`check_light`][pyasic.miners.BaseMiner.check_light],
[`fault_light_off`][pyasic.miners.BaseMiner.fault_light_off],
[`fault_light_on`][pyasic.miners.BaseMiner.fault_light_on],
[`get_config`][pyasic.miners.BaseMiner.get_config],
[`get_data`][pyasic.miners.BaseMiner.get_data],
[`get_errors`][pyasic.miners.BaseMiner.get_errors],
[`get_hostname`][pyasic.miners.BaseMiner.get_hostname],
[`get_model`][pyasic.miners.BaseMiner.get_model],
[`reboot`][pyasic.miners.BaseMiner.reboot],
[`restart_backend`][pyasic.miners.BaseMiner.restart_backend],
[`stop_mining`][pyasic.miners.BaseMiner.stop_mining],
[`resume_mining`][pyasic.miners.BaseMiner.resume_mining],
[`is_mining`][pyasic.miners.BaseMiner.is_mining],
[`send_config`][pyasic.miners.BaseMiner.send_config], and
[`set_power_limit`][pyasic.miners.BaseMiner.set_power_limit].
<br>
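The shape of that contract can be sketched, independent of `pyasic`'s actual class hierarchy, as an abstract base class that every miner backend must fill in (class and method bodies here are hypothetical and trimmed down):

```python
from abc import ABC, abstractmethod


class BaseMinerSketch(ABC):
    """Hypothetical, trimmed-down stand-in for the BaseMiner interface."""

    def __init__(self, ip: str):
        self.ip = ip

    @abstractmethod
    async def check_light(self) -> bool:
        ...

    @abstractmethod
    async def reboot(self) -> bool:
        ...


class DemoMiner(BaseMinerSketch):
    # a concrete miner class must implement every abstract control function
    async def check_light(self) -> bool:
        return False

    async def reboot(self) -> bool:
        return True
```

Instantiating a subclass that is missing any of the abstract methods raises `TypeError`, which is how the interface is enforced across backends.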
##### Usage
```python
import asyncio
from pyasic import get_miner


async def set_fault_light():
    miner = await get_miner("192.168.1.20")

    # call control function
    await miner.fault_light_on()


if __name__ == "__main__":
    asyncio.run(set_fault_light())
```
---
## Helper dataclasses
---
##### [`MinerConfig`][pyasic.config.MinerConfig] and [`MinerData`][pyasic.data.MinerData]
`pyasic` implements a few dataclasses as helpers to make data return types consistent across different miners and miner APIs. The different fields of these dataclasses can all be viewed with the classmethod `cls.fields()`.
---
##### [`MinerData`][pyasic.data.MinerData]
[`MinerData`][pyasic.data.MinerData] is the return from the [`get_data()`](#get-data) function, and is used to keep the dataset consistent across all returns.
@@ -278,19 +183,40 @@ list_of_miner_data = [d1, d2]
average_data = sum(list_of_miner_data, start=MinerData("0.0.0.0"))/len(list_of_miner_data)
```
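Summing and dividing works because `MinerData` defines the arithmetic dunder methods. A hypothetical minimal dataclass with the same behavior (and a `fields()` classmethod like the one mentioned above) might look like this; `SimpleData` is a stand-in, not pyasic's actual implementation:

```python
from dataclasses import dataclass, fields


@dataclass
class SimpleData:
    # hypothetical stand-in for MinerData; the real class has many more fields
    ip: str
    hashrate: float = 0.0

    @classmethod
    def fields(cls):
        return fields(cls)

    def __add__(self, other: "SimpleData") -> "SimpleData":
        # keep the identity of the accumulator, sum the numeric fields
        return SimpleData(ip=self.ip, hashrate=self.hashrate + other.hashrate)

    def __truediv__(self, divisor: int) -> "SimpleData":
        return SimpleData(ip=self.ip, hashrate=self.hashrate / divisor)


d1 = SimpleData("192.168.1.75", hashrate=90.0)
d2 = SimpleData("192.168.1.76", hashrate=110.0)
average = sum([d1, d2], start=SimpleData("0.0.0.0")) / 2
print(average.hashrate)  # 100.0
```

Because `sum()` only needs `__add__` and a start value, the same averaging expression shown above works for any number of data instances.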
---
<br>
##### [`MinerConfig`][pyasic.config.MinerConfig]
### [`MinerConfig`][pyasic.config.MinerConfig]
[`MinerConfig`][pyasic.config.MinerConfig] is pyasic's way to represent a configuration file from a miner.
It is the return from [`get_config()`](#get-config).
[`MinerConfig`][pyasic.config.MinerConfig] is `pyasic`'s way to represent a configuration file from a miner.
It is designed to unionize the configuration of all supported miner types, and is the return from [`get_config()`](#get-config).
Each miner has a unique way to convert the [`MinerConfig`][pyasic.config.MinerConfig] to its specific format; helper functions for this are provided in the class.
In most cases these helper functions should not be used, as [`send_config()`](#send-config) takes a [`MinerConfig`][pyasic.config.MinerConfig] and will do the conversion to the right type for you.
You can use the [`MinerConfig`][pyasic.config.MinerConfig] as follows:
```python
import asyncio
from pyasic import get_miner


async def update_config():
    miner = await get_miner("192.168.1.20")

    # get config
    cfg = await miner.get_config()

    # send config
    await miner.send_config(cfg)


if __name__ == "__main__":
    asyncio.run(update_config())
```
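The per-miner helper functions mentioned above each flatten the unified config into whatever shape that backend expects. A hypothetical sketch of one such conversion, loosely modeled on the flat `pool_1`/`worker_1`/`passwd_1` layout Whatsminer-style firmware uses (the `Pool` class and `as_wm_sketch` function are illustrative, not pyasic's actual code):

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Pool:
    url: str
    username: str
    password: str


def as_wm_sketch(pools: List[Pool], user_suffix: Optional[str] = None) -> dict:
    # flatten up to three pools into the flat key layout the device expects
    out = {}
    for i in range(1, 4):
        if i <= len(pools):
            pool = pools[i - 1]
            user = pool.username + user_suffix if user_suffix else pool.username
            out[f"pool_{i}"] = pool.url
            out[f"worker_{i}"] = user
            out[f"passwd_{i}"] = pool.password
        else:
            # unused pool slots are sent as empty strings
            out[f"pool_{i}"] = ""
            out[f"worker_{i}"] = ""
            out[f"passwd_{i}"] = ""
    return out
```

In practice you rarely need this level of detail: `send_config()` accepts the unified config and performs the backend-specific conversion internally.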
---
## Settings
`pyasic` has settings designed to make using large groups of miners easier. You can set the default password for all types of miners using the [`pyasic.settings`][pyasic.settings] module, used as follows:
---
`pyasic` has settings designed to make using large groups of miners easier. You can set the default password for all types of miners using the `pyasic.settings` module, used as follows:
```python
from pyasic import settings
@@ -298,18 +224,23 @@ from pyasic import settings
settings.update("default_antminer_password", "my_pwd")
```
Here are all of the settings and their default values:
##### Default values:
```
"network_ping_retries": 1,
"network_ping_timeout": 3,
"network_scan_threads": 300,
"factory_get_retries": 1,
"factory_get_timeout": 3,
"get_data_retries": 1,
"api_function_timeout": 5,
"default_whatsminer_password": "admin",
"default_innosilicon_password": "admin",
"default_antminer_password": "root",
"default_bosminer_password": "root",
"default_vnish_password": "admin",
"default_epic_password": "letmein",
"default_goldshell_password": "123456789",
# ADVANCED
# Only use this if you know what you are doing
"socket_linger_time": 1000,
```
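A registry like this can be implemented as little more than a module-level dict with an update helper. A hypothetical sketch of the pattern (not pyasic's actual settings module):

```python
# hypothetical stand-in for a pyasic.settings-style module
_settings = {
    "network_ping_retries": 1,
    "network_ping_timeout": 3,
    "default_antminer_password": "root",
}


def get(name: str, default=None):
    return _settings.get(name, default)


def update(name: str, value):
    # reject unknown keys so typos fail loudly instead of silently
    if name not in _settings:
        raise KeyError(f"unknown setting: {name}")
    _settings[name] = value


update("default_antminer_password", "my_pwd")
print(get("default_antminer_password"))  # my_pwd
```

Keeping the settings in one module means every miner created afterwards picks up the new defaults without passing passwords around explicitly.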

docs/miners/functions.md (new file, 91 lines)

@@ -0,0 +1,91 @@
## Control functionality
### Check Light
::: pyasic.miners.BaseMiner.check_light
handler: python
options:
heading_level: 4
### Fault Light Off
::: pyasic.miners.BaseMiner.fault_light_off
handler: python
options:
heading_level: 4
### Fault Light On
::: pyasic.miners.BaseMiner.fault_light_on
handler: python
options:
heading_level: 4
### Get Config
::: pyasic.miners.BaseMiner.get_config
handler: python
options:
heading_level: 4
### Get Data
::: pyasic.miners.BaseMiner.get_data
handler: python
options:
heading_level: 4
### Get Errors
::: pyasic.miners.BaseMiner.get_errors
handler: python
options:
heading_level: 4
### Get Hostname
::: pyasic.miners.BaseMiner.get_hostname
handler: python
options:
heading_level: 4
### Get Model
::: pyasic.miners.BaseMiner.get_model
handler: python
options:
heading_level: 4
### Reboot
::: pyasic.miners.BaseMiner.reboot
handler: python
options:
heading_level: 4
### Restart Backend
::: pyasic.miners.BaseMiner.restart_backend
handler: python
options:
heading_level: 4
### Stop Mining
::: pyasic.miners.BaseMiner.stop_mining
handler: python
options:
heading_level: 4
### Resume Mining
::: pyasic.miners.BaseMiner.resume_mining
handler: python
options:
heading_level: 4
### Is Mining
::: pyasic.miners.BaseMiner.is_mining
handler: python
options:
heading_level: 4
### Send Config
::: pyasic.miners.BaseMiner.send_config
handler: python
options:
heading_level: 4
### Set Power Limit
::: pyasic.miners.BaseMiner.set_power_limit
handler: python
options:
heading_level: 4


@@ -4,6 +4,7 @@ nav:
- Introduction: "index.md"
- Miners:
- Supported Miners: "miners/supported_types.md"
- Standard Functionality: "miners/functions.md"
- Miner Factory: "miners/miner_factory.md"
- Network:
- Miner Network: "network/miner_network.md"


@@ -213,11 +213,11 @@ If you are sure you want to use this command please use API.send_command("{comma
# append that data if there is more, and then onto the main loop.
# the password timeout might need to be longer than 1, but it seems to work for now.
ret_data = await asyncio.wait_for(reader.read(1), timeout=1)
except (asyncio.TimeoutError):
except asyncio.TimeoutError:
return b"{}"
try:
ret_data += await asyncio.wait_for(reader.read(4096), timeout=timeout)
except (ConnectionAbortedError):
except ConnectionAbortedError:
return b"{}"
# loop to receive all the data


@@ -487,6 +487,34 @@ class BTMinerAPI(BaseMinerAPI):
"""
return await self.send_privileged_command("set_low_power")
async def set_high_power(self) -> dict:
"""Set low power mode on the miner using the API.
<details>
<summary>Expand</summary>
Set low power mode on the miner using the API, only works after
changing the password of the miner using the Whatsminer tool.
Returns:
A reply informing of the status of setting low power mode.
</details>
"""
return await self.send_privileged_command("set_high_power")
async def set_normal_power(self) -> dict:
"""Set low power mode on the miner using the API.
<details>
<summary>Expand</summary>
Set low power mode on the miner using the API, only works after
changing the password of the miner using the Whatsminer tool.
Returns:
A reply informing of the status of setting low power mode.
</details>
"""
return await self.send_privileged_command("set_normal_power")
async def update_firmware(self): # noqa - static
"""Not implemented."""
# to be determined if this will be added later


@@ -13,664 +13,161 @@
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from copy import deepcopy
from dataclasses import asdict, dataclass, field
import logging
import random
import string
import time
from dataclasses import asdict, dataclass, fields
from enum import IntEnum
from typing import List, Literal
import toml
import yaml
class X19PowerMode(IntEnum):
Normal = 0
Sleep = 1
LPM = 3
@dataclass
class _Pool:
"""A dataclass for pool information.
Attributes:
url: URL of the pool.
username: Username on the pool.
password: Worker password on the pool.
"""
url: str = ""
username: str = ""
password: str = ""
@classmethod
def fields(cls):
return fields(cls)
def from_dict(self, data: dict):
"""Convert raw pool data as a dict to usable data and save it to this class.
Parameters:
data: The raw config data to convert.
"""
for key in data.keys():
if key == "url":
self.url = data[key]
if key in ["user", "username"]:
self.username = data[key]
if key in ["pass", "password"]:
self.password = data[key]
return self
def as_wm(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a dict usable by an Whatsminer device.
Parameters:
user_suffix: The suffix to append to username.
"""
username = self.username
if user_suffix:
username = f"{username}{user_suffix}"
pool = {"url": self.url, "user": username, "pass": self.password}
return pool
def as_x19(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a dict usable by an X19 device.
Parameters:
user_suffix: The suffix to append to username.
"""
username = self.username
if user_suffix:
username = f"{username}{user_suffix}"
pool = {"url": self.url, "user": username, "pass": self.password}
return pool
def as_x17(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a dict usable by an X5 device.
Parameters:
user_suffix: The suffix to append to username.
"""
username = self.username
if user_suffix:
username = f"{username}{user_suffix}"
pool = {"url": self.url, "user": username, "pass": self.password}
return pool
def as_goldshell(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a dict usable by a goldshell device.
Parameters:
user_suffix: The suffix to append to username.
"""
username = self.username
if user_suffix:
username = f"{username}{user_suffix}"
pool = {"url": self.url, "user": username, "pass": self.password}
return pool
def as_inno(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a dict usable by an Innosilicon device.
Parameters:
user_suffix: The suffix to append to username.
"""
username = self.username
if user_suffix:
username = f"{username}{user_suffix}"
pool = {
f"Pool": self.url,
f"UserName": username,
f"Password": self.password,
}
return pool
def as_avalon(self, user_suffix: str = None) -> str:
"""Convert the data in this class to a string usable by an Avalonminer device.
Parameters:
user_suffix: The suffix to append to username.
"""
username = self.username
if user_suffix:
username = f"{username}{user_suffix}"
pool = ",".join([self.url, username, self.password])
return pool
def as_bos(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a dict usable by an BOSMiner device.
Parameters:
user_suffix: The suffix to append to username.
"""
username = self.username
if user_suffix:
username = f"{username}{user_suffix}"
pool = {"url": self.url, "user": username, "password": self.password}
return pool
@dataclass
class _PoolGroup:
"""A dataclass for pool group information.
Attributes:
quota: The group quota.
group_name: The name of the pool group.
pools: A list of pools in this group.
"""
quota: int = 1
group_name: str = None
pools: List[_Pool] = None
@classmethod
def fields(cls):
return fields(cls)
def __post_init__(self):
if not self.group_name:
self.group_name = "".join(
random.choice(string.ascii_uppercase + string.digits) for _ in range(6)
) # generate random pool group name in case it isn't set
def from_dict(self, data: dict):
"""Convert raw pool group data as a dict to usable data and save it to this class.
Parameters:
data: The raw config data to convert.
"""
pools = []
for key in data.keys():
if key in ["name", "group_name"]:
self.group_name = data[key]
if key == "quota":
self.quota = data[key]
if key in ["pools", "pool"]:
for pool in data[key]:
pools.append(_Pool().from_dict(pool))
self.pools = pools
return self
def as_x19(self, user_suffix: str = None) -> List[dict]:
"""Convert the data in this class to a list usable by an X19 device.
Parameters:
user_suffix: The suffix to append to username.
"""
pools = []
for pool in self.pools[:3]:
pools.append(pool.as_x19(user_suffix=user_suffix))
return pools
def as_x17(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a list usable by an X17 device.
Parameters:
user_suffix: The suffix to append to username.
"""
pools = {
"_ant_pool1url": "",
"_ant_pool1user": "",
"_ant_pool1pw": "",
"_ant_pool2url": "",
"_ant_pool2user": "",
"_ant_pool2pw": "",
"_ant_pool3url": "",
"_ant_pool3user": "",
"_ant_pool3pw": "",
}
for idx, pool in enumerate(self.pools[:3]):
pools[f"_ant_pool{idx+1}url"] = pool.as_x17(user_suffix=user_suffix)["url"]
pools[f"_ant_pool{idx+1}user"] = pool.as_x17(user_suffix=user_suffix)[
"user"
]
pools[f"_ant_pool{idx+1}pw"] = pool.as_x17(user_suffix=user_suffix)["pass"]
return pools
def as_goldshell(self, user_suffix: str = None) -> list:
"""Convert the data in this class to a list usable by a goldshell device.
Parameters:
user_suffix: The suffix to append to username.
"""
return [pool.as_goldshell(user_suffix=user_suffix) for pool in self.pools[:3]]
def as_inno(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a list usable by an Innosilicon device.
Parameters:
user_suffix: The suffix to append to username.
"""
pools = {
"Pool1": None,
"UserName1": None,
"Password1": None,
"Pool2": None,
"UserName2": None,
"Password2": None,
"Pool3": None,
"UserName3": None,
"Password3": None,
}
for idx, pool in enumerate(self.pools[:3]):
pool_data = pool.as_inno(user_suffix=user_suffix)
for key in pool_data:
pools[f"{key}{idx+1}"] = pool_data[key]
return pools
def as_wm(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a list usable by a Whatsminer device.
Parameters:
user_suffix: The suffix to append to username.
"""
pools = {}
for i in range(1, 4):
if i <= len(self.pools):
pool_wm = self.pools[i - 1].as_wm(user_suffix)
pools[f"pool_{i}"] = pool_wm["url"]
pools[f"worker_{i}"] = pool_wm["user"]
pools[f"passwd_{i}"] = pool_wm["pass"]
else:
pools[f"pool_{i}"] = ""
pools[f"worker_{i}"] = ""
pools[f"passwd_{i}"] = ""
return pools
def as_avalon(self, user_suffix: str = None) -> str:
"""Convert the data in this class to a dict usable by an Avalonminer device.
Parameters:
user_suffix: The suffix to append to username.
"""
pool = self.pools[0].as_avalon(user_suffix=user_suffix)
return pool
def as_bos(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a dict usable by an BOSMiner device.
Parameters:
user_suffix: The suffix to append to username.
"""
group = {
"name": self.group_name,
"quota": self.quota,
"pool": [pool.as_bos(user_suffix=user_suffix) for pool in self.pools],
}
return group
from pyasic.config.fans import FanModeConfig
from pyasic.config.mining import MiningModeConfig
from pyasic.config.pools import PoolConfig
from pyasic.config.power_scaling import PowerScalingConfig, PowerScalingShutdown
from pyasic.config.temperature import TemperatureConfig
@dataclass
class MinerConfig:
"""A dataclass for miner configuration information.
Attributes:
pool_groups: A list of pool groups in this config.
temp_mode: The temperature control mode.
temp_target: The target temp.
temp_hot: The hot temp (100% fans).
temp_dangerous: The dangerous temp (shutdown).
minimum_fans: The minimum numbers of fans to run the miner.
fan_speed: Manual fan speed to run the fan at (only if temp_mode == "manual").
asicboost: Whether or not to enable asicboost.
autotuning_enabled: Whether or not to enable autotuning.
autotuning_mode: Autotuning mode, either "wattage" or "hashrate".
autotuning_wattage: The wattage to use when autotuning.
autotuning_hashrate: The hashrate to use when autotuning.
dps_enabled: Whether or not to enable dynamic power scaling.
dps_power_step: The amount of power to reduce autotuning by when the miner reaches dangerous temp.
dps_min_power: The minimum power to reduce autotuning to.
dps_shutdown_enabled: Whether or not to shutdown the miner when `dps_min_power` is reached.
dps_shutdown_duration: The amount of time to shutdown for (in hours).
"""
pool_groups: List[_PoolGroup] = None
temp_mode: Literal["auto", "manual", "disabled"] = "auto"
temp_target: float = 70.0
temp_hot: float = 80.0
temp_dangerous: float = 100.0
minimum_fans: int = None
fan_speed: Literal[tuple(range(101))] = None # noqa - Ignore weird Literal usage
asicboost: bool = None
miner_mode: IntEnum = X19PowerMode.Normal
autotuning_enabled: bool = True
autotuning_mode: Literal["power", "hashrate"] = None
autotuning_wattage: int = None
autotuning_hashrate: int = None
dps_enabled: bool = None
dps_power_step: int = None
dps_min_power: int = None
dps_shutdown_enabled: bool = None
dps_shutdown_duration: float = None
@classmethod
def fields(cls):
return fields(cls)
pools: PoolConfig = field(default_factory=PoolConfig.default)
fan_mode: FanModeConfig = field(default_factory=FanModeConfig.default)
temperature: TemperatureConfig = field(default_factory=TemperatureConfig.default)
mining_mode: MiningModeConfig = field(default_factory=MiningModeConfig.default)
power_scaling: PowerScalingConfig = field(
default_factory=PowerScalingConfig.default
)
def as_dict(self) -> dict:
"""Convert the data in this class to a dict."""
logging.debug(f"MinerConfig - (To Dict) - Dumping Dict config")
data_dict = asdict(self)
for key in asdict(self).keys():
if isinstance(data_dict[key], IntEnum):
data_dict[key] = data_dict[key].value
if data_dict[key] is None:
del data_dict[key]
return data_dict
return asdict(self)
def as_toml(self) -> str:
"""Convert the data in this class to toml."""
logging.debug(f"MinerConfig - (To TOML) - Dumping TOML config")
return toml.dumps(self.as_dict())
def as_yaml(self) -> str:
"""Convert the data in this class to yaml."""
logging.debug(f"MinerConfig - (To YAML) - Dumping YAML config")
return yaml.dump(self.as_dict(), sort_keys=False)
def from_raw(self, data: dict):
"""Convert raw config data as a dict to usable data and save it to this class.
This should be able to handle any raw config file from any miner supported by pyasic.
Parameters:
data: The raw config data to convert.
"""
logging.debug(f"MinerConfig - (From Raw) - Loading raw config")
pool_groups = []
if isinstance(data, list):
# goldshell config list
data = {"pools": data}
for key in data.keys():
if key == "pools":
pool_groups.append(_PoolGroup().from_dict({"pools": data[key]}))
elif key == "group":
for group in data[key]:
pool_groups.append(_PoolGroup().from_dict(group))
if key == "bitmain-fan-ctrl":
if data[key]:
self.temp_mode = "manual"
if data.get("bitmain-fan-pwm"):
self.fan_speed = int(data["bitmain-fan-pwm"])
elif key == "bitmain-work-mode":
if data[key]:
self.miner_mode = X19PowerMode(int(data[key]))
elif key == "fan_control":
for _key in data[key]:
if _key == "min_fans":
self.minimum_fans = data[key][_key]
elif _key == "speed":
self.fan_speed = data[key][_key]
elif key == "temp_control":
for _key in data[key]:
if _key == "mode":
self.temp_mode = data[key][_key]
elif _key == "target_temp":
self.temp_target = data[key][_key]
elif _key == "hot_temp":
self.temp_hot = data[key][_key]
elif _key == "dangerous_temp":
self.temp_dangerous = data[key][_key]
if key == "hash_chain_global":
if data[key].get("asic_boost"):
self.asicboost = data[key]["asic_boost"]
if key == "autotuning":
for _key in data[key]:
if _key == "enabled":
self.autotuning_enabled = data[key][_key]
elif _key == "psu_power_limit":
self.autotuning_wattage = data[key][_key]
elif _key == "power_target":
self.autotuning_wattage = data[key][_key]
elif _key == "hashrate_target":
self.autotuning_hashrate = data[key][_key]
elif _key == "mode":
self.autotuning_mode = data[key][_key].replace("_target", "")
if key in ["power_scaling", "performance_scaling"]:
for _key in data[key]:
if _key == "enabled":
self.dps_enabled = data[key][_key]
elif _key == "power_step":
self.dps_power_step = data[key][_key]
elif _key in ["min_psu_power_limit", "min_power_target"]:
self.dps_min_power = data[key][_key]
elif _key == "shutdown_enabled":
self.dps_shutdown_enabled = data[key][_key]
elif _key == "shutdown_duration":
self.dps_shutdown_duration = data[key][_key]
self.pool_groups = pool_groups
return self
def from_api(self, pools: list):
"""Convert list output from the `AnyMiner.api.pools()` command into a usable data and save it to this class.
Parameters:
pools: The list of pool data to convert.
"""
logging.debug(f"MinerConfig - (From API) - Loading API config")
_pools = []
for pool in pools:
url = pool.get("URL")
user = pool.get("User")
_pools.append({"url": url, "user": user, "pass": "123"})
self.pool_groups = [_PoolGroup().from_dict({"pools": _pools})]
return self
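The shape of that conversion can be sketched as a plain helper (`pools_from_api` is a hypothetical name; like the method above, it substitutes a placeholder password, since the pools API does not report one):

```python
def pools_from_api(api_pools: list) -> list:
    # map cgminer-style pool entries ("URL"/"User" keys) to config entries;
    # the API never reports the pool password, so a placeholder is used
    return [
        {"url": pool.get("URL"), "user": pool.get("User"), "pass": "123"}
        for pool in api_pools
    ]
```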
def from_dict(self, data: dict):
"""Convert an output dict of this class back into usable data and save it to this class.
Parameters:
data: The dict config data to convert.
"""
logging.debug(f"MinerConfig - (From Dict) - Loading Dict config")
pool_groups = []
for group in data["pool_groups"]:
pool_groups.append(_PoolGroup().from_dict(group))
for key in data:
if (
hasattr(self, key)
and not key == "pool_groups"
and not key == "miner_mode"
):
setattr(self, key, data[key])
if key == "miner_mode":
self.miner_mode = X19PowerMode(data[key])
self.pool_groups = pool_groups
return self
def from_toml(self, data: str):
"""Convert output toml of this class back into usable data and save it to this class.
Parameters:
data: The toml config data to convert.
"""
logging.debug(f"MinerConfig - (From TOML) - Loading TOML config")
return self.from_dict(toml.loads(data))
def from_yaml(self, data: str):
"""Convert output yaml of this class back into usable data and save it to this class.
Parameters:
data: The yaml config data to convert.
"""
logging.debug(f"MinerConfig - (From YAML) - Loading YAML config")
return self.from_dict(yaml.load(data, Loader=yaml.SafeLoader))
def as_am_modern(self, user_suffix: str = None) -> dict:
return {
**self.fan_mode.as_am_modern(),
"freq-level": "100",
**self.mining_mode.as_am_modern(),
**self.pools.as_am_modern(user_suffix=user_suffix),
**self.temperature.as_am_modern(),
**self.power_scaling.as_am_modern(),
}
def as_wm(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a config usable by a Whatsminer device.
Parameters:
user_suffix: The suffix to append to username.
"""
logging.debug(f"MinerConfig - (As Whatsminer) - Generating Whatsminer config")
return {
"pools": self.pool_groups[0].as_wm(user_suffix=user_suffix),
"wattage": self.autotuning_wattage,
**self.fan_mode.as_wm(),
**self.mining_mode.as_wm(),
**self.pools.as_wm(user_suffix=user_suffix),
**self.temperature.as_wm(),
**self.power_scaling.as_wm(),
}
def as_am_old(self, user_suffix: str = None) -> dict:
return {
**self.fan_mode.as_am_old(),
**self.mining_mode.as_am_old(),
**self.pools.as_am_old(user_suffix=user_suffix),
**self.temperature.as_am_old(),
**self.power_scaling.as_am_old(),
}
def as_goldshell(self, user_suffix: str = None) -> dict:
return {
**self.fan_mode.as_goldshell(),
**self.mining_mode.as_goldshell(),
**self.pools.as_goldshell(user_suffix=user_suffix),
**self.temperature.as_goldshell(),
**self.power_scaling.as_goldshell(),
}
def as_avalon(self, user_suffix: str = None) -> dict:
return {
**self.fan_mode.as_avalon(),
**self.mining_mode.as_avalon(),
**self.pools.as_avalon(user_suffix=user_suffix),
**self.temperature.as_avalon(),
**self.power_scaling.as_avalon(),
}
def as_inno(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a config usable by an Innosilicon device.
Parameters:
user_suffix: The suffix to append to username.
"""
logging.debug(f"MinerConfig - (As Inno) - Generating Innosilicon config")
return self.pool_groups[0].as_inno(user_suffix=user_suffix)
def as_x19(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a config usable by an X19 device.
Parameters:
user_suffix: The suffix to append to username.
"""
logging.debug(f"MinerConfig - (As X19) - Generating X19 config")
    cfg = {
        "bitmain-fan-ctrl": False,
        "bitmain-fan-pwm": "100",
        "freq-level": "100",
        "miner-mode": str(self.miner_mode.value),
        "pools": self.pool_groups[0].as_x19(user_suffix=user_suffix),
    }
    if not self.temp_mode == "auto":
        cfg["bitmain-fan-ctrl"] = True
        if self.fan_speed:
            cfg["bitmain-fan-pwm"] = str(self.fan_speed)
    return cfg
def as_x17(self, user_suffix: str = None) -> dict:
"""Convert the data in this class to a config usable by an X5 device.
Parameters:
user_suffix: The suffix to append to username.
"""
cfg = self.pool_groups[0].as_x17(user_suffix=user_suffix)
return cfg
def as_goldshell(self, user_suffix: str = None) -> list:
"""Convert the data in this class to a config usable by a goldshell device.
Parameters:
user_suffix: The suffix to append to username.
"""
cfg = self.pool_groups[0].as_goldshell(user_suffix=user_suffix)
return cfg
def as_avalon(self, user_suffix: str = None) -> str:
"""Convert the data in this class to a config usable by an Avalonminer device.
Parameters:
user_suffix: The suffix to append to username.
"""
logging.debug(f"MinerConfig - (As Avalon) - Generating AvalonMiner config")
cfg = self.pool_groups[0].as_avalon(user_suffix=user_suffix)
return cfg
def as_bos(self, model: str = "S9", user_suffix: str = None) -> str:
    """Convert the data in this class to a config usable by a BOSMiner device.

    Parameters:
        model: The model of the miner to be used in the format portion of the config.
        user_suffix: The suffix to append to username.
    """
    logging.debug("MinerConfig - (As BOS) - Generating BOSMiner config")
    cfg = {
        "format": {
            "version": "1.2+",
            "model": f"Antminer {model.replace('j', 'J')}",
            "generator": "pyasic",
            "timestamp": int(time.time()),
        },
        "group": [
            group.as_bos(user_suffix=user_suffix) for group in self.pool_groups
        ],
        "temp_control": {
            "mode": self.temp_mode,
            "target_temp": self.temp_target,
            "hot_temp": self.temp_hot,
            "dangerous_temp": self.temp_dangerous,
        },
    }
    if self.autotuning_enabled or self.autotuning_wattage:
        cfg["autotuning"] = {}
        if self.autotuning_enabled:
            cfg["autotuning"]["enabled"] = True
        else:
            cfg["autotuning"]["enabled"] = False
        if self.autotuning_mode:
            cfg["format"]["version"] = "2.0"
            cfg["autotuning"]["mode"] = self.autotuning_mode + "_target"
            if self.autotuning_wattage:
                cfg["autotuning"]["power_target"] = self.autotuning_wattage
            elif self.autotuning_hashrate:
                cfg["autotuning"]["hashrate_target"] = self.autotuning_hashrate
        else:
            if self.autotuning_wattage:
                cfg["autotuning"]["psu_power_limit"] = self.autotuning_wattage
    if self.asicboost:
        cfg["hash_chain_global"] = {}
        cfg["hash_chain_global"]["asic_boost"] = self.asicboost
    if self.minimum_fans is not None or self.fan_speed is not None:
        cfg["fan_control"] = {}
        if self.minimum_fans is not None:
            cfg["fan_control"]["min_fans"] = self.minimum_fans
        if self.fan_speed is not None:
            cfg["fan_control"]["speed"] = self.fan_speed
    if any(
        [
            getattr(self, item)
            for item in [
                "dps_enabled",
                "dps_power_step",
                "dps_min_power",
                "dps_shutdown_enabled",
                "dps_shutdown_duration",
            ]
        ]
    ):
        cfg["power_scaling"] = {}
        if self.dps_enabled:
            cfg["power_scaling"]["enabled"] = self.dps_enabled
        if self.dps_power_step:
            cfg["power_scaling"]["power_step"] = self.dps_power_step
        if self.dps_min_power:
            if cfg["format"]["version"] == "2.0":
                cfg["power_scaling"]["min_power_target"] = self.dps_min_power
            else:
                cfg["power_scaling"]["min_psu_power_limit"] = self.dps_min_power
        if self.dps_shutdown_enabled:
            cfg["power_scaling"]["shutdown_enabled"] = self.dps_shutdown_enabled
        if self.dps_shutdown_duration:
            cfg["power_scaling"]["shutdown_duration"] = self.dps_shutdown_duration
    return toml.dumps(cfg)

def as_bosminer(self, user_suffix: str = None) -> dict:
    return {
        **merge(self.fan_mode.as_bosminer(), self.temperature.as_bosminer()),
        **self.mining_mode.as_bosminer(),
        **self.pools.as_bosminer(user_suffix=user_suffix),
        **self.power_scaling.as_bosminer(),
    }

def as_bos_grpc(self, user_suffix: str = None) -> dict:
    return {
        **self.fan_mode.as_bos_grpc(),
        **self.temperature.as_bos_grpc(),
        **self.mining_mode.as_bos_grpc(),
        **self.pools.as_bos_grpc(user_suffix=user_suffix),
        **self.power_scaling.as_bos_grpc(),
    }

def as_epic(self, user_suffix: str = None) -> dict:
    return {
        **self.fan_mode.as_epic(),
        **self.temperature.as_epic(),
        **self.mining_mode.as_epic(),
        **self.pools.as_epic(user_suffix=user_suffix),
        **self.power_scaling.as_epic(),
    }

@classmethod
def from_dict(cls, dict_conf: dict) -> "MinerConfig":
    return cls(
        pools=PoolConfig.from_dict(dict_conf.get("pools")),
        mining_mode=MiningModeConfig.from_dict(dict_conf.get("mining_mode")),
        fan_mode=FanModeConfig.from_dict(dict_conf.get("fan_mode")),
        temperature=TemperatureConfig.from_dict(dict_conf.get("temperature")),
        power_scaling=PowerScalingConfig.from_dict(dict_conf.get("power_scaling")),
    )

@classmethod
def from_api(cls, api_pools: dict) -> "MinerConfig":
    return cls(pools=PoolConfig.from_api(api_pools))
@classmethod
def from_am_modern(cls, web_conf: dict) -> "MinerConfig":
return cls(
pools=PoolConfig.from_am_modern(web_conf),
mining_mode=MiningModeConfig.from_am_modern(web_conf),
fan_mode=FanModeConfig.from_am_modern(web_conf),
)
@classmethod
def from_am_old(cls, web_conf: dict) -> "MinerConfig":
return cls.from_am_modern(web_conf)
@classmethod
def from_goldshell(cls, web_conf: dict) -> "MinerConfig":
return cls(pools=PoolConfig.from_am_modern(web_conf))
@classmethod
def from_inno(cls, web_pools: list) -> "MinerConfig":
return cls(pools=PoolConfig.from_inno(web_pools))
@classmethod
def from_bosminer(cls, toml_conf: dict) -> "MinerConfig":
return cls(
pools=PoolConfig.from_bosminer(toml_conf),
mining_mode=MiningModeConfig.from_bosminer(toml_conf),
fan_mode=FanModeConfig.from_bosminer(toml_conf),
temperature=TemperatureConfig.from_bosminer(toml_conf),
power_scaling=PowerScalingConfig.from_bosminer(toml_conf),
)
def merge(a: dict, b: dict) -> dict:
result = deepcopy(a)
for b_key, b_val in b.items():
a_val = result.get(b_key)
if isinstance(a_val, dict) and isinstance(b_val, dict):
result[b_key] = merge(a_val, b_val)
else:
result[b_key] = deepcopy(b_val)
return result
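A quick usage sketch of `merge` (body reproduced so the example runs on its own): nested dicts are merged key-by-key, with `b` winning on conflicts, and neither input is mutated.

```python
from copy import deepcopy

def merge(a: dict, b: dict) -> dict:
    # recursively merge b into a copy of a; b wins on conflicting scalar keys
    result = deepcopy(a)
    for b_key, b_val in b.items():
        a_val = result.get(b_key)
        if isinstance(a_val, dict) and isinstance(b_val, dict):
            result[b_key] = merge(a_val, b_val)
        else:
            result[b_key] = deepcopy(b_val)
    return result

base = {"temp_control": {"mode": "auto", "target_temp": 70}}
merged = merge(base, {"temp_control": {"mode": "manual"}})
```

This is what lets `as_bosminer` combine the fan-mode and temperature sections, which both contribute keys under `temp_control`, without one clobbering the other.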

pyasic/config/base.py

@@ -0,0 +1,95 @@
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from dataclasses import asdict, dataclass
from enum import Enum
from typing import Union
class MinerConfigOption(Enum):
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]):
return cls.default()
def as_am_modern(self) -> dict:
return self.value.as_am_modern()
def as_am_old(self) -> dict:
return self.value.as_am_old()
def as_wm(self) -> dict:
return self.value.as_wm()
def as_inno(self) -> dict:
return self.value.as_inno()
def as_goldshell(self) -> dict:
return self.value.as_goldshell()
def as_avalon(self) -> dict:
return self.value.as_avalon()
def as_bosminer(self) -> dict:
return self.value.as_bosminer()
def as_bos_grpc(self) -> dict:
return self.value.as_bos_grpc()
def as_epic(self) -> dict:
return self.value.as_epic()
def __call__(self, *args, **kwargs):
return self.value(*args, **kwargs)
@classmethod
def default(cls):
pass
@dataclass
class MinerConfigValue:
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]):
return cls()
def as_dict(self):
return asdict(self)
def as_am_modern(self) -> dict:
return {}
def as_am_old(self) -> dict:
return {}
def as_wm(self) -> dict:
return {}
def as_inno(self) -> dict:
return {}
def as_goldshell(self) -> dict:
return {}
def as_avalon(self) -> dict:
return {}
def as_bosminer(self) -> dict:
return {}
def as_bos_grpc(self) -> dict:
return {}
def as_epic(self) -> dict:
return {}

pyasic/config/fans.py

@@ -0,0 +1,134 @@
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from dataclasses import dataclass, field
from typing import Union
from pyasic.config.base import MinerConfigOption, MinerConfigValue
@dataclass
class FanModeNormal(MinerConfigValue):
mode: str = field(init=False, default="normal")
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "FanModeNormal":
return cls()
def as_am_modern(self) -> dict:
return {"bitmain-fan-ctrl": False, "bitmain-fan-pwn": "100"}
def as_bosminer(self) -> dict:
return {"temp_control": {"mode": "auto"}}
@dataclass
class FanModeManual(MinerConfigValue):
mode: str = field(init=False, default="manual")
minimum_fans: int = 1
speed: int = 100
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "FanModeManual":
cls_conf = {}
if dict_conf.get("min_fans") is not None:
cls_conf["minimum_fans"] = dict_conf["minimum_fans"]
if dict_conf.get("speed") is not None:
cls_conf["speed"] = dict_conf["speed"]
return cls(**cls_conf)
@classmethod
def from_bosminer(cls, toml_fan_conf: dict) -> "FanModeManual":
cls_conf = {}
if toml_fan_conf.get("min_fans") is not None:
cls_conf["minimum_fans"] = toml_fan_conf["min_fans"]
if toml_fan_conf.get("speed") is not None:
cls_conf["speed"] = toml_fan_conf["speed"]
return cls(**cls_conf)
def as_am_modern(self) -> dict:
return {"bitmain-fan-ctrl": True, "bitmain-fan-pwn": str(self.speed)}
def as_bosminer(self) -> dict:
return {
"temp_control": {"mode": "manual"},
"fan_control": {"min_fans": self.minimum_fans, "speed": self.speed},
}
@dataclass
class FanModeImmersion(MinerConfigValue):
mode: str = field(init=False, default="immersion")
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "FanModeImmersion":
return cls()
def as_am_modern(self) -> dict:
return {"bitmain-fan-ctrl": True, "bitmain-fan-pwn": "0"}
def as_bosminer(self) -> dict:
return {"temp_control": {"mode": "disabled"}}
class FanModeConfig(MinerConfigOption):
normal = FanModeNormal
manual = FanModeManual
immersion = FanModeImmersion
@classmethod
def default(cls):
return cls.normal()
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]):
if dict_conf is None:
return cls.default()
mode = dict_conf.get("mode")
if mode is None:
return cls.default()
clsattr = getattr(cls, mode, None)
if clsattr is not None:
return clsattr().from_dict(dict_conf)
@classmethod
def from_am_modern(cls, web_conf: dict):
if web_conf.get("bitmain-fan-ctrl") is not None:
fan_manual = web_conf["bitmain-fan-ctrl"]
if fan_manual:
return cls.manual(speed=web_conf["bitmain-fan-pwm"])
else:
return cls.normal()
else:
return cls.default()
@classmethod
def from_bosminer(cls, toml_conf: dict):
if toml_conf.get("temp_control") is None:
return cls.default()
if toml_conf["temp_control"].get("mode") is None:
return cls.default()
mode = toml_conf["temp_control"]["mode"]
if mode == "auto":
return cls.normal()
elif mode == "manual":
if toml_conf.get("fan_control"):
return cls.manual().from_bosminer(toml_conf["fan_control"])
return cls.manual()
elif mode == "disabled":
return cls.immersion()

pyasic/config/mining.py

@@ -0,0 +1,213 @@
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from dataclasses import dataclass, field
from typing import Union
from pyasic.config.base import MinerConfigOption, MinerConfigValue
@dataclass
class MiningModeNormal(MinerConfigValue):
mode: str = field(init=False, default="normal")
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "MiningModeNormal":
return cls()
def as_am_modern(self) -> dict:
return {"miner-mode": "0"}
def as_wm(self) -> dict:
return {"mode": self.mode}
@dataclass
class MiningModeSleep(MinerConfigValue):
mode: str = field(init=False, default="sleep")
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "MiningModeSleep":
return cls()
def as_am_modern(self) -> dict:
return {"miner-mode": "1"}
def as_wm(self) -> dict:
return {"mode": self.mode}
@dataclass
class MiningModeLPM(MinerConfigValue):
mode: str = field(init=False, default="low")
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "MiningModeLPM":
return cls()
def as_am_modern(self) -> dict:
return {"miner-mode": "3"}
def as_wm(self) -> dict:
return {"mode": self.mode}
@dataclass
class MiningModeHPM(MinerConfigValue):
mode: str = field(init=False, default="high")
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "MiningModeHPM":
return cls()
def as_am_modern(self):
return {"miner-mode": "0"}
def as_wm(self) -> dict:
return {"mode": self.mode}
@dataclass
class MiningModePowerTune(MinerConfigValue):
mode: str = field(init=False, default="power_tuning")
power: int = None
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "MiningModePowerTune":
return cls(dict_conf.get("power"))
def as_am_modern(self) -> dict:
return {"miner-mode": "0"}
def as_wm(self) -> dict:
if self.power is not None:
return {"mode": self.mode, self.mode: {"wattage": self.power}}
return {}
def as_bosminer(self) -> dict:
return {"autotuning": {"enabled": True, "psu_power_limit": self.power}}
@dataclass
class MiningModeHashrateTune(MinerConfigValue):
mode: str = field(init=False, default="hashrate_tuning")
hashrate: int = None
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "MiningModeHashrateTune":
return cls(dict_conf.get("hashrate"))
def as_am_modern(self) -> dict:
return {"miner-mode": "0"}
@dataclass
class ManualBoardSettings(MinerConfigValue):
freq: float
volt: float
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "ManualBoardSettings":
return cls(freq=dict_conf["freq"], volt=dict_conf["volt"])
def as_am_modern(self) -> dict:
return {"miner-mode": "0"}
@dataclass
class MiningModeManual(MinerConfigValue):
mode: str = field(init=False, default="manual")
global_freq: float
global_volt: float
boards: dict[int, ManualBoardSettings] = field(default_factory=dict)
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "MiningModeManual":
return cls(
global_freq=dict_conf["global_freq"],
global_volt=dict_conf["global_volt"],
boards={i: ManualBoardSettings.from_dict(dict_conf[i]) for i in dict_conf},
)
def as_am_modern(self) -> dict:
return {"miner-mode": "0"}
class MiningModeConfig(MinerConfigOption):
normal = MiningModeNormal
low = MiningModeLPM
high = MiningModeHPM
sleep = MiningModeSleep
power_tuning = MiningModePowerTune
hashrate_tuning = MiningModeHashrateTune
manual = MiningModeManual
@classmethod
def default(cls):
return cls.normal()
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]):
if dict_conf is None:
return cls.default()
mode = dict_conf.get("mode")
if mode is None:
return cls.default()
clsattr = getattr(cls, mode, None)
if clsattr is not None:
return clsattr().from_dict(dict_conf)
@classmethod
def from_am_modern(cls, web_conf: dict):
if web_conf.get("bitmain-work-mode") is not None:
work_mode = web_conf["bitmain-work-mode"]
if work_mode == "":
return cls.default()
if int(work_mode) == 0:
return cls.normal()
elif int(work_mode) == 1:
return cls.sleep()
elif int(work_mode) == 3:
return cls.low()
return cls.default()
@classmethod
def from_bosminer(cls, toml_conf: dict):
if toml_conf.get("autotuning") is None:
return cls.default()
autotuning_conf = toml_conf["autotuning"]
if autotuning_conf.get("enabled") is None:
return cls.default()
if not autotuning_conf["enabled"]:
return cls.default()
if autotuning_conf.get("psu_power_limit") is not None:
# old autotuning conf
return cls.power_tuning(autotuning_conf["psu_power_limit"])
if autotuning_conf.get("mode") is not None:
# new autotuning conf
mode = autotuning_conf["mode"]
if mode == "power_target":
if autotuning_conf.get("power_target") is not None:
return cls.power_tuning(autotuning_conf["power_target"])
return cls.power_tuning()
if mode == "hashrate_target":
if autotuning_conf.get("hashrate_target") is not None:
return cls.hashrate_tuning(autotuning_conf["hashrate_target"])
return cls.hashrate_tuning()
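The old-vs-new BOSMiner autotuning detection above can be sketched as a plain function (`classify_autotuning` is an illustrative name, not pyasic API): old configs carry `psu_power_limit` directly, newer ones declare an explicit `mode`.

```python
def classify_autotuning(toml_conf: dict) -> str:
    # old-style configs: autotuning.psu_power_limit; new-style: autotuning.mode
    conf = toml_conf.get("autotuning")
    if not conf or not conf.get("enabled"):
        return "normal"
    if conf.get("psu_power_limit") is not None:
        return "power_tuning"  # old-style conf
    mode = conf.get("mode")
    if mode == "power_target":
        return "power_tuning"
    if mode == "hashrate_target":
        return "hashrate_tuning"
    return "normal"
```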

pyasic/config/pools.py

@@ -0,0 +1,356 @@
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
import random
import string
from dataclasses import dataclass, field
from typing import Union
from pyasic.config.base import MinerConfigValue
@dataclass
class Pool(MinerConfigValue):
url: str
user: str
password: str
def as_am_modern(self, user_suffix: str = None):
if user_suffix is not None:
return {
"url": self.url,
"user": f"{self.user}{user_suffix}",
"pass": self.password,
}
return {"url": self.url, "user": self.user, "pass": self.password}
def as_wm(self, idx: int, user_suffix: str = None):
if user_suffix is not None:
return {
f"pool_{idx}": self.url,
f"worker_{idx}": f"{self.user}{user_suffix}",
f"passwd_{idx}": self.password,
}
return {
f"pool_{idx}": self.url,
f"worker_{idx}": self.user,
f"passwd_{idx}": self.password,
}
def as_am_old(self, idx: int, user_suffix: str = None):
if user_suffix is not None:
return {
f"_ant_pool{idx}url": self.url,
f"_ant_pool{idx}user": f"{self.user}{user_suffix}",
f"_ant_pool{idx}pw": self.password,
}
return {
f"_ant_pool{idx}url": self.url,
f"_ant_pool{idx}user": self.user,
f"_ant_pool{idx}pw": self.password,
}
def as_goldshell(self, user_suffix: str = None):
if user_suffix is not None:
return {
"url": self.url,
"user": f"{self.user}{user_suffix}",
"pass": self.password,
}
return {"url": self.url, "user": self.user, "pass": self.password}
def as_avalon(self, user_suffix: str = None):
if user_suffix is not None:
return ",".join([self.url, f"{self.user}{user_suffix}", self.password])
return ",".join([self.url, self.user, self.password])
def as_inno(self, idx: int, user_suffix: str = None):
if user_suffix is not None:
return {
f"Pool{idx}": self.url,
f"UserName{idx}": f"{self.user}{user_suffix}",
f"Password{idx}": self.password,
}
return {
f"Pool{idx}": self.url,
f"UserName{idx}": self.user,
f"Password{idx}": self.password,
}
def as_bosminer(self, user_suffix: str = None):
if user_suffix is not None:
return {
"url": self.url,
"user": f"{self.user}{user_suffix}",
"password": self.password,
}
return {"url": self.url, "user": self.user, "password": self.password}
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "Pool":
return cls(
url=dict_conf["url"], user=dict_conf["user"], password=dict_conf["password"]
)
@classmethod
def from_api(cls, api_pool: dict) -> "Pool":
return cls(url=api_pool["URL"], user=api_pool["User"], password="x")
@classmethod
def from_am_modern(cls, web_pool: dict) -> "Pool":
return cls(
url=web_pool["url"], user=web_pool["user"], password=web_pool["pass"]
)
# TODO: check if this is accurate, user/username, pass/password
@classmethod
def from_goldshell(cls, web_pool: dict) -> "Pool":
return cls(
url=web_pool["url"], user=web_pool["user"], password=web_pool["pass"]
)
@classmethod
def from_inno(cls, web_pool: dict) -> "Pool":
return cls(
url=web_pool["url"], user=web_pool["user"], password=web_pool["pass"]
)
@classmethod
def from_bosminer(cls, toml_pool_conf: dict) -> "Pool":
return cls(
url=toml_pool_conf["url"],
user=toml_pool_conf["user"],
password=toml_pool_conf["password"],
)
@dataclass
class PoolGroup(MinerConfigValue):
pools: list[Pool] = field(default_factory=list)
quota: int = 1
name: str = None
def __post_init__(self):
if self.name is None:
self.name = "".join(
random.choice(string.ascii_uppercase + string.digits) for _ in range(6)
) # generate random pool group name in case it isn't set
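The fallback group name generated in `__post_init__` is just a short random alphanumeric tag; standalone (the `random_group_name` helper name is illustrative):

```python
import random
import string

def random_group_name(length: int = 6) -> str:
    # uppercase letters + digits, matching the fallback in __post_init__
    return "".join(
        random.choice(string.ascii_uppercase + string.digits) for _ in range(length)
    )
```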
def as_am_modern(self, user_suffix: str = None) -> list:
pools = []
idx = 0
while idx < 3:
if len(self.pools) > idx:
pools.append(self.pools[idx].as_am_modern(user_suffix=user_suffix))
else:
pools.append(Pool("", "", "").as_am_modern())
idx += 1
return pools
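Antminer web configs expect exactly three pool slots, so shorter groups are padded with blank entries; the pattern, sketched with a hypothetical `pad_pools` helper:

```python
def pad_pools(pools: list, slots: int = 3) -> list:
    # fill unused slots with blank pool entries so the config always has `slots` pools
    blank = {"url": "", "user": "", "pass": ""}
    return [pools[i] if i < len(pools) else dict(blank) for i in range(slots)]
```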
def as_wm(self, user_suffix: str = None) -> dict:
pools = {}
idx = 0
while idx < 3:
if len(self.pools) > idx:
pools.update(
**self.pools[idx].as_wm(idx=idx + 1, user_suffix=user_suffix)
)
else:
pools.update(**Pool("", "", "").as_wm(idx=idx + 1))
idx += 1
return pools
def as_am_old(self, user_suffix: str = None) -> dict:
pools = {}
idx = 0
while idx < 3:
if len(self.pools) > idx:
pools.update(
**self.pools[idx].as_am_old(idx=idx + 1, user_suffix=user_suffix)
)
else:
pools.update(**Pool("", "", "").as_am_old(idx=idx + 1))
idx += 1
return pools
def as_goldshell(self, user_suffix: str = None) -> list:
return [pool.as_goldshell(user_suffix) for pool in self.pools]
def as_avalon(self, user_suffix: str = None) -> dict:
if len(self.pools) > 0:
return self.pools[0].as_avalon(user_suffix=user_suffix)
return Pool("", "", "").as_avalon()
def as_inno(self, user_suffix: str = None) -> dict:
pools = {}
idx = 0
while idx < 3:
if len(self.pools) > idx:
pools.update(
**self.pools[idx].as_inno(idx=idx + 1, user_suffix=user_suffix)
)
else:
pools.update(**Pool("", "", "").as_inno(idx=idx + 1))
idx += 1
return pools
def as_bosminer(self, user_suffix: str = None) -> dict:
if len(self.pools) > 0:
conf = {
"name": self.name,
"pool": [
pool.as_bosminer(user_suffix=user_suffix) for pool in self.pools
],
}
if self.quota is not None:
conf["quota"] = self.quota
return conf
return {"name": "Group", "pool": []}
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "PoolGroup":
cls_conf = {}
if dict_conf.get("quota") is not None:
cls_conf["quota"] = dict_conf["quota"]
if dict_conf.get("name") is not None:
cls_conf["name"] = dict_conf["name"]
cls_conf["pools"] = [Pool.from_dict(p) for p in dict_conf["pools"]]
return cls(**cls_conf)
@classmethod
def from_api(cls, api_pool_list: list) -> "PoolGroup":
pools = []
for pool in api_pool_list:
pools.append(Pool.from_api(pool))
return cls(pools=pools)
@classmethod
def from_am_modern(cls, web_pool_list: list) -> "PoolGroup":
pools = []
for pool in web_pool_list:
pools.append(Pool.from_am_modern(pool))
return cls(pools=pools)
@classmethod
def from_goldshell(cls, web_pools: list) -> "PoolGroup":
return cls([Pool.from_goldshell(p) for p in web_pools])
@classmethod
def from_inno(cls, web_pools: list) -> "PoolGroup":
return cls([Pool.from_inno(p) for p in web_pools])
@classmethod
def from_bosminer(cls, toml_group_conf: dict) -> "PoolGroup":
if toml_group_conf.get("pool") is not None:
return cls(
name=toml_group_conf["name"],
quota=toml_group_conf.get("quota"),
pools=[Pool.from_bosminer(p) for p in toml_group_conf["pool"]],
)
return cls()
@dataclass
class PoolConfig(MinerConfigValue):
groups: list[PoolGroup] = field(default_factory=list)
@classmethod
def default(cls) -> "PoolConfig":
return cls()
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "PoolConfig":
if dict_conf is None:
return cls.default()
return cls(groups=[PoolGroup.from_dict(g) for g in dict_conf["groups"]])
@classmethod
def simple(cls, pools: list[Union[Pool, dict[str, str]]]) -> "PoolConfig":
group_pools = []
for pool in pools:
if isinstance(pool, dict):
pool = Pool(**pool)
group_pools.append(pool)
return cls(groups=[PoolGroup(pools=group_pools)])
def as_am_modern(self, user_suffix: str = None) -> dict:
if len(self.groups) > 0:
return {"pools": self.groups[0].as_am_modern(user_suffix=user_suffix)}
return {"pools": PoolGroup().as_am_modern()}
def as_wm(self, user_suffix: str = None) -> dict:
if len(self.groups) > 0:
return {"pools": self.groups[0].as_wm(user_suffix=user_suffix)}
return {"pools": PoolGroup().as_wm()}
def as_am_old(self, user_suffix: str = None) -> dict:
if len(self.groups) > 0:
return self.groups[0].as_am_old(user_suffix=user_suffix)
return PoolGroup().as_am_old()
def as_goldshell(self, user_suffix: str = None) -> dict:
if len(self.groups) > 0:
return {"pools": self.groups[0].as_goldshell(user_suffix=user_suffix)}
return {"pools": PoolGroup().as_goldshell()}
def as_avalon(self, user_suffix: str = None) -> dict:
if len(self.groups) > 0:
return {"pools": self.groups[0].as_avalon(user_suffix=user_suffix)}
return {"pools": PoolGroup().as_avalon()}
def as_inno(self, user_suffix: str = None) -> dict:
if len(self.groups) > 0:
return self.groups[0].as_inno(user_suffix=user_suffix)
return PoolGroup().as_inno()
def as_bosminer(self, user_suffix: str = None) -> dict:
if len(self.groups) > 0:
return {
"group": [g.as_bosminer(user_suffix=user_suffix) for g in self.groups]
}
return {"group": [PoolGroup().as_bosminer()]}
def as_bos_grpc(self, user_suffix: str = None) -> dict:
return {}
@classmethod
def from_api(cls, api_pools: dict) -> "PoolConfig":
pool_data = api_pools["POOLS"]
pool_data = sorted(pool_data, key=lambda x: int(x["POOL"]))
return cls([PoolGroup.from_api(pool_data)])
@classmethod
def from_am_modern(cls, web_conf: dict) -> "PoolConfig":
pool_data = web_conf["pools"]
return cls([PoolGroup.from_am_modern(pool_data)])
@classmethod
def from_goldshell(cls, web_pools: list) -> "PoolConfig":
return cls([PoolGroup.from_goldshell(web_pools)])
@classmethod
def from_inno(cls, web_pools: list) -> "PoolConfig":
return cls([PoolGroup.from_inno(web_pools)])
@classmethod
def from_bosminer(cls, toml_conf: dict) -> "PoolConfig":
if toml_conf.get("group") is None:
return cls()
return cls([PoolGroup.from_bosminer(g) for g in toml_conf["group"]])
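The `PoolConfig.simple()` helper above accepts either `Pool` objects or plain dicts and wraps them in a single default group. A minimal standalone sketch of that normalization, using stripped-down stand-ins for the real pyasic classes (not importable pyasic code):

```python
from dataclasses import dataclass, field
from typing import List, Union


@dataclass
class Pool:
    # Stand-in for pyasic's Pool; only the fields used here.
    url: str = None
    user: str = None
    password: str = None


@dataclass
class PoolGroup:
    pools: List[Pool] = field(default_factory=list)


@dataclass
class PoolConfig:
    groups: List[PoolGroup] = field(default_factory=list)

    @classmethod
    def simple(cls, pools: List[Union[Pool, dict]]) -> "PoolConfig":
        # Normalize dicts into Pool objects, then wrap everything in a
        # single default group, mirroring the method above.
        group_pools = []
        for pool in pools:
            if isinstance(pool, dict):
                pool = Pool(**pool)
            group_pools.append(pool)
        return cls(groups=[PoolGroup(pools=group_pools)])


cfg = PoolConfig.simple(
    [{"url": "stratum+tcp://pool.example:3333", "user": "wallet.worker"}]
)
print(cfg.groups[0].pools[0].user)  # -> wallet.worker
```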

View File

@@ -0,0 +1,189 @@
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from dataclasses import dataclass, field
from typing import Union
from pyasic.config.base import MinerConfigOption, MinerConfigValue
from pyasic.web.bosminer.proto.braiins.bos.v1 import DpsPowerTarget, DpsTarget, Hours
@dataclass
class PowerScalingShutdownEnabled(MinerConfigValue):
mode: str = field(init=False, default="enabled")
duration: int = None
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "PowerScalingShutdownEnabled":
return cls(duration=dict_conf.get("duration"))
def as_bosminer(self) -> dict:
cfg = {"shutdown_enabled": True}
if self.duration is not None:
cfg["shutdown_duration"] = self.duration
return cfg
def as_bos_grpc(self) -> dict:
cfg = {"enable_shutdown": True}
if self.duration is not None:
cfg["shutdown_duration"] = Hours(self.duration)
return cfg
@dataclass
class PowerScalingShutdownDisabled(MinerConfigValue):
mode: str = field(init=False, default="disabled")
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "PowerScalingShutdownDisabled":
return cls()
def as_bosminer(self) -> dict:
return {"shutdown_enabled": False}
def as_bos_grpc(self) -> dict:
return {"enable_shutdown": False}
class PowerScalingShutdown(MinerConfigOption):
enabled = PowerScalingShutdownEnabled
disabled = PowerScalingShutdownDisabled
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]):
if dict_conf is None:
return cls.default()
mode = dict_conf.get("mode")
if mode is None:
return cls.default()
clsattr = getattr(cls, mode, None)
if clsattr is not None:
return clsattr().from_dict(dict_conf)
return cls.default()
@classmethod
def from_bosminer(cls, power_scaling_conf: dict):
sd_enabled = power_scaling_conf.get("shutdown_enabled")
if sd_enabled is not None:
if sd_enabled:
return cls.enabled(power_scaling_conf.get("shutdown_duration"))
else:
return cls.disabled()
return None
@dataclass
class PowerScalingEnabled(MinerConfigValue):
mode: str = field(init=False, default="enabled")
power_step: int = None
minimum_power: int = None
shutdown_enabled: Union[
PowerScalingShutdownEnabled, PowerScalingShutdownDisabled
] = None
@classmethod
def from_bosminer(cls, power_scaling_conf: dict) -> "PowerScalingEnabled":
power_step = power_scaling_conf.get("power_step")
min_power = power_scaling_conf.get("min_psu_power_limit")
sd_mode = PowerScalingShutdown.from_bosminer(power_scaling_conf)
return cls(
power_step=power_step, minimum_power=min_power, shutdown_enabled=sd_mode
)
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "PowerScalingEnabled":
cls_conf = {
"power_step": dict_conf.get("power_step"),
"minimum_power": dict_conf.get("minimum_power"),
}
shutdown_enabled = dict_conf.get("shutdown_enabled")
if shutdown_enabled is not None:
cls_conf["shutdown_enabled"] = PowerScalingShutdown.from_dict(
shutdown_enabled
)
return cls(**cls_conf)
def as_bosminer(self) -> dict:
cfg = {"enabled": True}
if self.power_step is not None:
cfg["power_step"] = self.power_step
if self.minimum_power is not None:
cfg["min_psu_power_limit"] = self.minimum_power
if self.shutdown_enabled is not None:
cfg = {**cfg, **self.shutdown_enabled.as_bosminer()}
return {"power_scaling": cfg}
def as_bos_grpc(self) -> dict:
cfg = {"enable": True}
target_conf = {}
if self.power_step is not None:
target_conf["power_step"] = self.power_step
if self.minimum_power is not None:
target_conf["min_power_target"] = self.minimum_power
cfg["target"] = DpsTarget(power_target=DpsPowerTarget(**target_conf))
if self.shutdown_enabled is not None:
cfg = {**cfg, **self.shutdown_enabled.as_bos_grpc()}
return {"dps": cfg}
@dataclass
class PowerScalingDisabled(MinerConfigValue):
mode: str = field(init=False, default="disabled")
class PowerScalingConfig(MinerConfigOption):
enabled = PowerScalingEnabled
disabled = PowerScalingDisabled
@classmethod
def default(cls):
return cls.disabled()
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]):
if dict_conf is None:
return cls.default()
mode = dict_conf.get("mode")
if mode is None:
return cls.default()
clsattr = getattr(cls, mode, None)
if clsattr is not None:
return clsattr().from_dict(dict_conf)
return cls.default()
@classmethod
def from_bosminer(cls, toml_conf: dict):
power_scaling = toml_conf.get("power_scaling")
if power_scaling is not None:
enabled = power_scaling.get("enabled")
if enabled is not None:
if enabled:
return cls.enabled().from_bosminer(power_scaling)
else:
return cls.disabled()
return cls.default()
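Both `PowerScalingShutdown.from_dict` and `PowerScalingConfig.from_dict` use the same pattern: the `"mode"` string selects a class attribute holding the concrete config class. A self-contained sketch of that dispatch (class names here are illustrative stand-ins, not the pyasic ones):

```python
from dataclasses import dataclass, field


@dataclass
class Enabled:
    mode: str = field(init=False, default="enabled")


@dataclass
class Disabled:
    mode: str = field(init=False, default="disabled")


class Option:
    # Each mode name maps to the dataclass that implements it.
    enabled = Enabled
    disabled = Disabled

    @classmethod
    def default(cls):
        return cls.disabled()

    @classmethod
    def from_dict(cls, dict_conf):
        if dict_conf is None:
            return cls.default()
        mode = dict_conf.get("mode")
        if mode is None:
            return cls.default()
        # Look up the class attribute named after the mode string,
        # falling back to the default when the mode is unknown.
        clsattr = getattr(cls, mode, None)
        if clsattr is not None:
            return clsattr()
        return cls.default()


print(Option.from_dict({"mode": "enabled"}).mode)  # -> enabled
print(Option.from_dict(None).mode)                 # -> disabled
```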

View File

@@ -0,0 +1,58 @@
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from dataclasses import dataclass
from typing import Union
from pyasic.config.base import MinerConfigValue
@dataclass
class TemperatureConfig(MinerConfigValue):
target: int = None
hot: int = None
danger: int = None
@classmethod
def default(cls):
return cls()
def as_bosminer(self) -> dict:
temp_cfg = {}
if self.target is not None:
temp_cfg["target_temp"] = self.target
if self.hot is not None:
temp_cfg["hot_temp"] = self.hot
if self.danger is not None:
temp_cfg["dangerous_temp"] = self.danger
return {"temp_control": temp_cfg}
@classmethod
def from_dict(cls, dict_conf: Union[dict, None]) -> "TemperatureConfig":
if dict_conf is None:
return cls.default()
return cls(
target=dict_conf.get("target"),
hot=dict_conf.get("hot"),
danger=dict_conf.get("danger"),
)
@classmethod
def from_bosminer(cls, toml_conf: dict) -> "TemperatureConfig":
temp_control = toml_conf.get("temp_control")
if temp_control is not None:
return cls(
target=temp_control.get("target_temp"),
hot=temp_control.get("hot_temp"),
danger=temp_control.get("dangerous_temp"),
)
return cls()
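The `as_bosminer()`/`from_bosminer()` pair above is a simple key-renaming round trip between the dataclass fields and bosminer's `temp_control` TOML table. A standalone sketch demonstrating the round trip (a stripped-down copy, not the real pyasic class):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TemperatureConfig:
    target: Optional[int] = None
    hot: Optional[int] = None
    danger: Optional[int] = None

    def as_bosminer(self) -> dict:
        # Emit only the keys that are actually set.
        temp_cfg = {}
        if self.target is not None:
            temp_cfg["target_temp"] = self.target
        if self.hot is not None:
            temp_cfg["hot_temp"] = self.hot
        if self.danger is not None:
            temp_cfg["dangerous_temp"] = self.danger
        return {"temp_control": temp_cfg}

    @classmethod
    def from_bosminer(cls, toml_conf: dict) -> "TemperatureConfig":
        temp_control = toml_conf.get("temp_control")
        if temp_control is not None:
            return cls(
                target=temp_control.get("target_temp"),
                hot=temp_control.get("hot_temp"),
                danger=temp_control.get("dangerous_temp"),
            )
        return cls()


cfg = TemperatureConfig(target=75, hot=85, danger=95)
assert TemperatureConfig.from_bosminer(cfg.as_bosminer()) == cfg
print(cfg.as_bosminer())
```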

View File

@@ -15,7 +15,7 @@
# ------------------------------------------------------------------------------
from pyasic.miners.backends import BOSMiner
from pyasic.miners.types import S19, S19j, S19jNoPIC, S19jPro, S19Pro
from pyasic.miners.types import S19, S19j, S19jNoPIC, S19jPro, S19kProNoPIC, S19Pro
class BOSMinerS19(BOSMiner, S19):
@@ -36,3 +36,7 @@ class BOSMinerS19jNoPIC(BOSMiner, S19jNoPIC):
class BOSMinerS19jPro(BOSMiner, S19jPro):
pass
class BOSMinerS19kProNoPIC(BOSMiner, S19kProNoPIC):
pass

View File

@@ -19,6 +19,7 @@ from .S19 import (
BOSMinerS19j,
BOSMinerS19jNoPIC,
BOSMinerS19jPro,
BOSMinerS19kProNoPIC,
BOSMinerS19Pro,
)
from .T19 import BOSMinerT19

View File

@@ -15,26 +15,32 @@
# ------------------------------------------------------------------------------
from pyasic.miners.backends import ePIC
from pyasic.miners.types import (
S19,
S19Pro,
S19j,
S19jPro,
S19XP,
)
from pyasic.miners.types import S19, S19XP, S19j, S19jPro, S19jProPlus, S19kPro, S19Pro
class ePICS19(ePIC, S19):
pass
class ePICS19Pro(ePIC, S19Pro):
pass
class ePICS19j(ePIC, S19j):
pass
class ePICS19jPro(ePIC, S19jPro):
pass
class ePICS19jProPlus(ePIC, S19jProPlus):
pass
class ePICS19kPro(ePIC, S19kPro):
pass
class ePICS19XP(ePIC, S19XP):
pass

View File

@@ -19,5 +19,7 @@ from .S19 import (
ePICS19Pro,
ePICS19j,
ePICS19jPro,
ePICS19jProPlus,
ePICS19kPro,
ePICS19XP,
)

View File

@@ -25,4 +25,4 @@ from .hiveon import Hiveon
from .luxminer import LUXMiner
from .vnish import VNish
from .epic import ePIC
from .whatsminer import M2X, M3X, M5X
from .whatsminer import M2X, M3X, M5X, M6X

View File

@@ -18,7 +18,7 @@ import asyncio
from typing import List, Optional, Union
from pyasic.API import APIError
from pyasic.config import MinerConfig, X19PowerMode
from pyasic.config import MinerConfig, MiningModeConfig
from pyasic.data import Fan, HashBoard
from pyasic.data.error_codes import MinerErrorData, X19Error
from pyasic.miners.backends.bmminer import BMMiner
@@ -80,23 +80,21 @@ class AntminerModern(BMMiner):
async def get_config(self) -> MinerConfig:
data = await self.web.get_miner_conf()
if data:
self.config = MinerConfig().from_raw(data)
self.config = MinerConfig.from_am_modern(data)
return self.config
async def send_config(self, config: MinerConfig, user_suffix: str = None) -> None:
self.config = config
conf = config.as_x19(user_suffix=user_suffix)
data = await self.web.set_miner_conf(conf)
if data:
if data.get("code") == "M000":
return
for i in range(7):
data = await self.get_config()
if data.as_x19() == conf:
break
await asyncio.sleep(1)
await self.web.set_miner_conf(config.as_am_modern(user_suffix=user_suffix))
# if data:
# if data.get("code") == "M000":
# return
#
# for i in range(7):
# data = await self.get_config()
# if data == self.config:
# break
# await asyncio.sleep(1)
async def fault_light_on(self) -> bool:
data = await self.web.blink(blink=True)
@@ -120,13 +118,13 @@ class AntminerModern(BMMiner):
async def stop_mining(self) -> bool:
cfg = await self.get_config()
cfg.miner_mode = X19PowerMode.Sleep
cfg.miner_mode = MiningModeConfig.sleep
await self.send_config(cfg)
return True
async def resume_mining(self) -> bool:
cfg = await self.get_config()
cfg.miner_mode = X19PowerMode.Normal
cfg.miner_mode = MiningModeConfig.normal
await self.send_config(cfg)
return True
@@ -349,11 +347,12 @@ class AntminerOld(CGMiner):
async def get_config(self) -> MinerConfig:
data = await self.web.get_miner_conf()
if data:
self.config = MinerConfig().from_raw(data)
self.config = MinerConfig.from_am_old(data)
return self.config
async def send_config(self, config: MinerConfig, user_suffix: str = None) -> None:
await self.web.set_miner_conf(config.as_x17(user_suffix=user_suffix))
self.config = config
await self.web.set_miner_conf(config.as_am_old(user_suffix=user_suffix))
async def get_mac(self) -> Union[str, None]:
try:

View File

@@ -72,7 +72,7 @@ class BFGMiner(BaseMiner):
except APIError:
return self.config
self.config = MinerConfig().from_api(pools["POOLS"])
self.config = MinerConfig.from_api(pools)
return self.config
async def send_config(self, config: MinerConfig, user_suffix: str = None) -> None:

View File

@@ -64,7 +64,14 @@ class BFGMinerGoldshell(BFGMiner):
self.data_locations = GOLDSHELL_DATA_LOC
async def get_config(self) -> MinerConfig:
return MinerConfig().from_raw(await self.web.pools())
# get pool data
try:
pools = await self.web.pools()
except APIError:
return self.config
self.config = MinerConfig.from_goldshell(pools)
return self.config
async def send_config(self, config: MinerConfig, user_suffix: str = None) -> None:
pools_data = await self.web.pools()
@@ -80,7 +87,7 @@ class BFGMinerGoldshell(BFGMiner):
self.config = config
# send them back 1 at a time
for pool in config.as_goldshell(user_suffix=user_suffix):
for pool in config.as_goldshell(user_suffix=user_suffix)["pools"]:
await self.web.newpool(
url=pool["url"], user=pool["user"], password=pool["pass"]
)

View File

@@ -104,7 +104,7 @@ class BMMiner(BaseMiner):
except APIError:
return self.config
self.config = MinerConfig().from_api(pools["POOLS"])
self.config = MinerConfig.from_api(pools)
return self.config
async def reboot(self) -> bool:

View File

@@ -15,6 +15,7 @@
# ------------------------------------------------------------------------------
import asyncio
import logging
import time
from collections import namedtuple
from typing import List, Optional, Tuple, Union
@@ -22,6 +23,7 @@ import toml
from pyasic.API.bosminer import BOSMinerAPI
from pyasic.config import MinerConfig
from pyasic.config.mining import MiningModePowerTune
from pyasic.data import Fan, HashBoard
from pyasic.data.error_codes import BraiinsOSError, MinerErrorData
from pyasic.errors import APIError
@@ -297,11 +299,6 @@ class BOSMiner(BaseMiner):
return False
async def get_config(self) -> MinerConfig:
"""Gets the config for the miner and sets it as `self.config`.
Returns:
The config from `self.config`.
"""
logging.debug(f"{self}: Getting config.")
try:
@@ -316,21 +313,42 @@ class BOSMiner(BaseMiner):
(await conn.run("cat /etc/bosminer.toml")).stdout
)
logging.debug(f"{self}: Converting config file.")
cfg = MinerConfig().from_raw(toml_data)
cfg = MinerConfig.from_bosminer(toml_data)
self.config = cfg
return self.config
async def send_config(self, config: MinerConfig, user_suffix: str = None) -> None:
"""Configures miner with yaml config."""
logging.debug(f"{self}: Sending config.")
self.config = config
toml_conf = config.as_bos(
model=self.model.replace(" (BOS)", ""), user_suffix=user_suffix
if self.web.grpc is not None:
try:
await self._send_config_grpc(config, user_suffix)
return
except Exception:
pass
await self._send_config_bosminer(config, user_suffix)
async def _send_config_grpc(self, config: MinerConfig, user_suffix: str = None):
raise NotImplementedError
mining_mode = config.mining_mode
async def _send_config_bosminer(self, config: MinerConfig, user_suffix: str = None):
toml_conf = toml.dumps(
{
"format": {
"version": "1.2+",
"generator": "pyasic",
"model": f"{self.make.replace('Miner', 'miner')} {self.model.replace(' (BOS)', '').replace('j', 'J')}",
"timestamp": int(time.time()),
},
**config.as_bosminer(user_suffix=user_suffix),
}
)
try:
conn = await self._get_ssh_connection()
except ConnectionError:
return None
except ConnectionError as e:
raise APIError("SSH connection failed when sending config.") from e
async with conn:
# BBB check because bitmain suxx
bbb_check = await conn.run(
@@ -362,7 +380,7 @@ class BOSMiner(BaseMiner):
cfg = await self.get_config()
if cfg is None:
return False
cfg.autotuning_wattage = wattage
cfg.mining_mode = MiningModePowerTune(wattage)
await self.send_config(cfg)
except Exception as e:
logging.warning(f"{self} set_power_limit: {e}")
@@ -539,7 +557,6 @@ class BOSMiner(BaseMiner):
async def get_hashrate(
self, api_summary: dict = None, graphql_hashrate: dict = None
) -> Optional[float]:
# get hr from graphql
if not graphql_hashrate:
try:
@@ -622,7 +639,7 @@ class BOSMiner(BaseMiner):
offset = 0
if 3 in b_names:
offset = 1
elif 6 in b_names:
elif 6 in b_names or 7 in b_names or 8 in b_names:
offset = 6
for hb in boards:
_id = int(hb["name"]) - offset

View File

@@ -15,12 +15,11 @@
# ------------------------------------------------------------------------------
import logging
import warnings
from collections import namedtuple
from typing import List, Optional, Tuple
from pyasic.API.btminer import BTMinerAPI
from pyasic.config import MinerConfig
from pyasic.config import MinerConfig, MiningModeConfig
from pyasic.data import Fan, HashBoard
from pyasic.data.error_codes import MinerErrorData, WhatsminerError
from pyasic.errors import APIError
@@ -198,44 +197,68 @@ class BTMiner(BaseMiner):
try:
await self.api.update_pools(**pools_conf)
if conf["mode"] == "normal":
await self.api.set_normal_power()
elif conf["mode"] == "high":
await self.api.set_high_power()
elif conf["mode"] == "low":
await self.api.set_low_power()
elif conf["mode"] == "power_tuning":
await self.api.adjust_power_limit(conf["power_tuning"]["wattage"])
except APIError:
pass
try:
await self.api.adjust_power_limit(conf["wattage"])
except APIError:
# cannot set wattage
# cannot update, no API access usually
pass
async def get_config(self) -> MinerConfig:
pools = None
summary = None
cfg = MinerConfig()
status = None
try:
data = await self.api.multicommand("pools", "summary")
data = await self.api.multicommand("pools", "summary", "status")
pools = data["pools"][0]
summary = data["summary"][0]
status = data["status"][0]
except APIError as e:
logging.warning(e)
except LookupError:
pass
if pools:
if "POOLS" in pools:
cfg = cfg.from_api(pools["POOLS"])
if pools is not None:
cfg = MinerConfig.from_api(pools)
else:
# something's wrong with the miner
warnings.warn(
f"Failed to gather pool config for miner: {self}, miner did not return pool information."
)
if summary:
if "SUMMARY" in summary:
if wattage := summary["SUMMARY"][0].get("Power Limit"):
cfg.autotuning_wattage = wattage
cfg = MinerConfig()
self.config = cfg
is_mining = await self.is_mining(status)
if not is_mining:
cfg.mining_mode = MiningModeConfig.sleep()
return cfg
return self.config
if summary is not None:
mining_mode = None
try:
mining_mode = summary["SUMMARY"][0]["Power Mode"]
except LookupError:
pass
if mining_mode == "High":
cfg.mining_mode = MiningModeConfig.high()
return cfg
elif mining_mode == "Low":
cfg.mining_mode = MiningModeConfig.low()
return cfg
try:
power_lim = summary["SUMMARY"][0]["Power Limit"]
except LookupError:
power_lim = None
if power_lim is None:
cfg.mining_mode = MiningModeConfig.normal()
return cfg
cfg.mining_mode = MiningModeConfig.power_tuning(power_lim)
self.config = cfg
return self.config
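The branching in the new `get_config` above boils down to a pure mapping from the Whatsminer summary payload to a mining mode: "Power Mode" wins when reported, otherwise a set "Power Limit" implies power tuning, and no limit means normal. A standalone sketch of that mapping (plain strings stand in for `MiningModeConfig` members):

```python
def mode_from_summary(summary: dict):
    # "Power Mode" wins when present; otherwise a set "Power Limit"
    # implies power tuning, and no limit at all means normal mode.
    try:
        power_mode = summary["SUMMARY"][0]["Power Mode"]
    except LookupError:
        power_mode = None
    if power_mode == "High":
        return ("high", None)
    if power_mode == "Low":
        return ("low", None)
    try:
        power_limit = summary["SUMMARY"][0]["Power Limit"]
    except LookupError:
        power_limit = None
    if power_limit is None:
        return ("normal", None)
    return ("power_tuning", power_limit)


print(mode_from_summary({"SUMMARY": [{"Power Mode": "Low"}]}))   # -> ('low', None)
print(mode_from_summary({"SUMMARY": [{"Power Limit": 3400}]}))   # -> ('power_tuning', 3400)
print(mode_from_summary({"SUMMARY": [{}]}))                      # -> ('normal', None)
```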
async def set_power_limit(self, wattage: int) -> bool:
try:
@@ -386,7 +409,6 @@ class BTMiner(BaseMiner):
pass
async def get_hashboards(self, api_devs: dict = None) -> List[HashBoard]:
hashboards = [
HashBoard(slot=i, expected_chips=self.nominal_chips)
for i in range(self.ideal_hashboards)

View File

@@ -143,10 +143,13 @@ class CGMiner(BaseMiner):
return True
async def get_config(self) -> MinerConfig:
api_pools = await self.api.pools()
# get pool data
try:
pools = await self.api.pools()
except APIError:
return self.config
if api_pools:
self.config = MinerConfig().from_api(api_pools["POOLS"])
self.config = MinerConfig.from_api(pools)
return self.config
async def fault_light_off(self) -> bool:

View File

@@ -100,17 +100,17 @@ class CGMinerAvalon(CGMiner):
return False
async def send_config(self, config: MinerConfig, user_suffix: str = None) -> None:
"""Configures miner with yaml config."""
self.config = config
return None
logging.debug(f"{self}: Sending config.") # noqa - This doesnt work...
conf = config.as_avalon(user_suffix=user_suffix)
try:
data = await self.api.ascset( # noqa
0, "setpool", f"root,root,{conf}"
) # this should work but doesn't
except APIError:
pass
pass
# self.config = config
# return None
# logging.debug(f"{self}: Sending config.") # noqa - This doesnt work...
# conf = config.as_avalon(user_suffix=user_suffix)
# try:
# data = await self.api.ascset( # noqa
# 0, "setpool", f"root,root,{conf}"
# ) # this should work but doesn't
# except APIError:
# pass
# return data
@staticmethod

View File

@@ -14,16 +14,14 @@
# limitations under the License. -
# ------------------------------------------------------------------------------
from typing import Optional
from typing import List, Optional, Tuple, Union
from pyasic.data import Fan, HashBoard
from pyasic.data.error_codes import MinerErrorData, X19Error
from pyasic.errors import APIError
from pyasic.logger import logger
from pyasic.miners.backends.bmminer import BMMiner
from pyasic.web.epic import ePICWebAPI
from pyasic.data import Fan, HashBoard
from typing import List, Optional, Tuple, Union
from pyasic.data.error_codes import MinerErrorData, X19Error
EPIC_DATA_LOC = {
"mac": {"cmd": "get_mac", "kwargs": {"web_summary": {"web": "network"}}},
@@ -36,12 +34,21 @@ EPIC_DATA_LOC = {
"cmd": "get_nominal_hashrate",
"kwargs": {"web_summary": {"web": "summary"}},
},
"hashboards": {"cmd": "get_hashboards", "kwargs": {"web_summary": {"web": "summary"}, "web_hashrate": {"web": "hashrate"}}},
"hashboards": {
"cmd": "get_hashboards",
"kwargs": {
"web_summary": {"web": "summary"},
"web_hashrate": {"web": "hashrate"},
},
},
"env_temp": {"cmd": "get_env_temp", "kwargs": {}},
"wattage": {"cmd": "get_wattage", "kwargs": {"web_summary": {"web": "summary"}}},
"fans": {"cmd": "get_fans", "kwargs": {"web_summary": {"web": "summary"}}},
"fan_psu": {"cmd": "get_fan_psu", "kwargs": {}},
"fault_light": {"cmd": "get_fault_light", "kwargs": {"web_summary": {"web": "summary"}}},
"fault_light": {
"cmd": "get_fault_light",
"kwargs": {"web_summary": {"web": "summary"}},
},
"pools": {"cmd": "get_pools", "kwargs": {"web_summary": {"web": "summary"}}},
"is_mining": {"cmd": "is_mining", "kwargs": {}},
"uptime": {"cmd": "get_uptime", "kwargs": {"web_summary": {"web": "summary"}}},
@@ -148,12 +155,11 @@ class ePIC(BMMiner):
if web_summary["HBs"] != None:
for hb in web_summary["HBs"]:
hashrate += hb["Hashrate"][0]
return round(
float(float(hashrate/ 1000000)), 2)
return round(float(float(hashrate / 1000000)), 2)
except (LookupError, ValueError, TypeError) as e:
logger.error(e)
pass
async def get_nominal_hashrate(self, web_summary: dict = None) -> Optional[float]:
# get hr from API
if not web_summary:
@@ -170,16 +176,14 @@ class ePIC(BMMiner):
if hb["Hashrate"][1] == 0:
ideal = 1.0
else:
ideal = hb["Hashrate"][1]/100
hashrate += hb["Hashrate"][0]/ideal
return round(
float(float(hashrate/ 1000000)), 2)
ideal = hb["Hashrate"][1] / 100
hashrate += hb["Hashrate"][0] / ideal
return round(float(float(hashrate / 1000000)), 2)
except (IndexError, KeyError, ValueError, TypeError) as e:
logger.error(e)
pass
async def get_fw_ver(self, web_summary: dict = None) -> Optional[str]:
if not web_summary:
web_summary = await self.web.summary()
@@ -208,8 +212,10 @@ class ePIC(BMMiner):
except (LookupError, ValueError, TypeError):
fans.append(Fan())
return fans
async def get_hashboards(self, web_summary: dict = None, web_hashrate: dict= None) -> List[HashBoard]:
async def get_hashboards(
self, web_summary: dict = None, web_hashrate: dict = None
) -> List[HashBoard]:
if not web_summary:
try:
web_summary = await self.web.summary()
@@ -220,51 +226,53 @@ class ePIC(BMMiner):
web_hashrate = await self.web.hashrate()
except APIError:
pass
hb_list = [HashBoard(slot=i, expected_chips=self.nominal_chips) for i in range(self.ideal_hashboards)]
hb_list = [
HashBoard(slot=i, expected_chips=self.nominal_chips)
for i in range(self.ideal_hashboards)
]
if web_summary["HBs"] != None:
for hb in web_summary["HBs"]:
for hr in web_hashrate:
if hr["Index"] == hb["Index"]:
num_of_chips = len(hr["Data"])
hashrate = hb["Hashrate"][0]
#Update the Hashboard object
# Update the Hashboard object
hb_list[hr["Index"]].expected_chips = num_of_chips
hb_list[hr["Index"]].missing = False
hb_list[hr["Index"]].hashrate = round(hashrate/1000000,2)
hb_list[hr["Index"]].hashrate = round(hashrate / 1000000, 2)
hb_list[hr["Index"]].chips = num_of_chips
hb_list[hr["Index"]].temp = hb["Temperature"]
return hb_list
async def is_mining(self, *args, **kwargs) -> Optional[bool]:
return None
async def get_pools(self, web_summary: dict = None) -> List[dict]:
groups = []
groups = []
if not web_summary:
try:
web_summary = await self.api.summary()
except APIError:
pass
if not web_summary:
try:
web_summary = await self.api.summary()
except APIError:
pass
if web_summary:
try:
pools = {}
for i, pool in enumerate(web_summary["StratumConfigs"]):
pools[f"pool_{i + 1}_url"] = (
pool["pool"]
.replace("stratum+tcp://", "")
.replace("stratum2+tcp://", "")
)
pools[f"pool_{i + 1}_user"] = pool["login"]
pools["quota"] = pool["Quota"] if pool.get("Quota") else "0"
if web_summary:
try:
pools = {}
for i, pool in enumerate(web_summary["StratumConfigs"]):
pools[f"pool_{i + 1}_url"] = (
pool["pool"]
.replace("stratum+tcp://", "")
.replace("stratum2+tcp://", "")
)
pools[f"pool_{i + 1}_user"] = pool["login"]
pools["quota"] = pool["Quota"] if pool.get("Quota") else "0"
groups.append(pools)
except KeyError:
pass
return groups
groups.append(pools)
except KeyError:
pass
return groups
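The `get_pools` loop above flattens ePIC's `StratumConfigs` list into the numbered-key dict format pyasic uses elsewhere. A standalone sketch of that transformation:

```python
def parse_stratum_configs(stratum_configs: list) -> list:
    # Flatten the list of stratum entries into numbered pool_N_* keys,
    # stripping the stratum URL scheme prefixes, as in the loop above.
    pools = {}
    for i, pool in enumerate(stratum_configs):
        pools[f"pool_{i + 1}_url"] = (
            pool["pool"]
            .replace("stratum+tcp://", "")
            .replace("stratum2+tcp://", "")
        )
        pools[f"pool_{i + 1}_user"] = pool["login"]
        # Note: "quota" is overwritten each iteration, so the last
        # entry's quota wins, matching the original loop's behavior.
        pools["quota"] = pool["Quota"] if pool.get("Quota") else "0"
    return [pools]


groups = parse_stratum_configs(
    [{"pool": "stratum+tcp://pool.example:3333", "login": "wallet.worker"}]
)
print(groups[0]["pool_1_url"])  # -> pool.example:3333
```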
async def get_uptime(self, web_summary: dict = None) -> Optional[int]:
if not web_summary:
web_summary = await self.web.summary()
@@ -275,7 +283,7 @@ class ePIC(BMMiner):
except KeyError:
pass
return None
async def get_fault_light(self, web_summary: dict = None) -> bool:
if not web_summary:
web_summary = await self.web.summary()
@@ -286,7 +294,7 @@ class ePIC(BMMiner):
except KeyError:
pass
return False
async def get_errors(self, web_summary: dict = None) -> List[MinerErrorData]:
if not web_summary:
web_summary = await self.web.summary()

View File

@@ -196,15 +196,9 @@ class LUXMiner(BaseMiner):
return False
async def get_config(self) -> MinerConfig:
"""Gets the config for the miner and sets it as `self.config`.
Returns:
The config from `self.config`.
"""
return self.config
async def send_config(self, config: MinerConfig, user_suffix: str = None) -> None:
"""Configures miner with yaml config."""
pass
async def set_power_limit(self, wattage: int) -> bool:

View File

@@ -16,6 +16,12 @@
from pyasic.miners.backends.btminer import BTMiner
class M6X(BTMiner):
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.supports_autotuning = True
class M5X(BTMiner):
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)

View File

@@ -38,16 +38,13 @@ class CGMinerT3HPlus(CGMiner, T3HPlus):
return False
async def get_config(self, api_pools: dict = None) -> MinerConfig:
if not api_pools:
try:
api_pools = await self.api.pools()
except APIError as e:
logging.warning(e)
# get pool data
try:
pools = await self.api.pools()
except APIError:
return self.config
if api_pools:
if "POOLS" in api_pools.keys():
cfg = MinerConfig().from_api(api_pools["POOLS"])
self.config = cfg
self.config = MinerConfig.from_api(pools)
return self.config
async def reboot(self) -> bool:

View File

@@ -288,6 +288,25 @@ MINER_CLASSES = {
"M56SVH30": BTMinerM56SVH30,
"M56S+VJ30": BTMinerM56SPlusVJ30,
"M59VH30": BTMinerM59VH30,
"M60VK10": BTMinerM60VK10,
"M60VK20": BTMinerM60VK20,
"M60VK30": BTMinerM60VK30,
"M60VK40": BTMinerM60VK40,
"M60SVK10": BTMinerM60SVK10,
"M60SVK20": BTMinerM60SVK20,
"M60SVK30": BTMinerM60SVK30,
"M60SVK40": BTMinerM60SVK40,
"M63VK10": BTMinerM63VK10,
"M63VK20": BTMinerM63VK20,
"M63VK30": BTMinerM63VK30,
"M63SVK10": BTMinerM63SVK10,
"M63SVK20": BTMinerM63SVK20,
"M63SVK30": BTMinerM63SVK30,
"M66VK20": BTMinerM66VK20,
"M66VK30": BTMinerM66VK30,
"M66SVK20": BTMinerM66SVK20,
"M66SVK30": BTMinerM66SVK30,
"M66SVK40": BTMinerM66SVK40,
},
MinerTypes.AVALONMINER: {
None: CGMinerAvalon,
@@ -333,6 +352,7 @@ MINER_CLASSES = {
"ANTMINER S19J PRO": BOSMinerS19jPro,
"ANTMINER S19J PRO NOPIC": BOSMinerS19jPro,
"ANTMINER T19": BOSMinerT19,
"ANTMINER S19K PRO NOPIC": BOSMinerS19kProNoPIC,
},
MinerTypes.VNISH: {
None: VNish,
@@ -354,6 +374,8 @@ MINER_CLASSES = {
"ANTMINER S19 PRO": ePICS19Pro,
"ANTMINER S19J": ePICS19j,
"ANTMINER S19J PRO": ePICS19jPro,
"ANTMINER S19J PRO+": ePICS19jProPlus,
"ANTMINER S19K PRO": ePICS19kPro,
"ANTMINER S19 XP": ePICS19XP,
},
MinerTypes.HIVEON: {
@@ -368,24 +390,18 @@ MINER_CLASSES = {
async def concurrent_get_first_result(tasks: list, verification_func: Callable):
while True:
await asyncio.sleep(0)
if len(tasks) == 0:
return
for task in tasks:
if task.done():
try:
result = await task
except asyncio.CancelledError:
for t in tasks:
t.cancel()
raise
else:
if not verification_func(result):
continue
for t in tasks:
t.cancel()
return result
res = None
for fut in asyncio.as_completed(tasks):
res = await fut
if verification_func(res):
break
for t in tasks:
t.cancel()
try:
await t
except asyncio.CancelledError:
pass
return res
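The rewritten `concurrent_get_first_result` above awaits tasks as they finish, keeps the first result that passes verification, then cancels and drains the rest. A runnable standalone sketch of that pattern (`probe` delays and values are made up for illustration):

```python
import asyncio


async def first_verified(tasks, verify):
    # Await tasks in completion order; stop at the first verified result.
    res = None
    for fut in asyncio.as_completed(tasks):
        res = await fut
        if verify(res):
            break
    # Cancel everything and drain so no task is left pending.
    for t in tasks:
        t.cancel()
        try:
            await t
        except asyncio.CancelledError:
            pass
    return res


async def probe(delay: float, value):
    await asyncio.sleep(delay)
    return value


async def main():
    tasks = [
        asyncio.create_task(probe(0.05, None)),   # finishes first, fails verification
        asyncio.create_task(probe(0.1, "hit")),   # first verified result
        asyncio.create_task(probe(5.0, "slow")),  # cancelled without waiting
    ]
    return await first_verified(tasks, lambda x: x is not None)


print(asyncio.run(main()))  # -> hit
```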
class MinerFactory:
@@ -432,7 +448,7 @@ class MinerFactory:
task, timeout=settings.get("factory_get_timeout", 3)
)
except asyncio.TimeoutError:
task.cancel()
continue
else:
if miner_type is not None:
break
@@ -460,7 +476,7 @@ class MinerFactory:
task, timeout=settings.get("factory_get_timeout", 3)
)
except asyncio.TimeoutError:
task.cancel()
pass
boser_enabled = None
if miner_type == MinerTypes.BRAIINS_OS:
@@ -486,19 +502,30 @@ class MinerFactory:
return await concurrent_get_first_result(tasks, lambda x: x is not None)
async def _get_miner_web(self, ip: str):
urls = [f"http://{ip}/", f"https://{ip}/"]
async with httpx.AsyncClient(
transport=settings.transport(verify=False)
) as session:
tasks = [asyncio.create_task(self._web_ping(session, url)) for url in urls]
tasks = []
try:
urls = [f"http://{ip}/", f"https://{ip}/"]
async with httpx.AsyncClient(
transport=settings.transport(verify=False)
) as session:
tasks = [
asyncio.create_task(self._web_ping(session, url)) for url in urls
]
text, resp = await concurrent_get_first_result(
tasks,
lambda x: x[0] is not None
and self._parse_web_type(x[0], x[1]) is not None,
)
if text is not None:
return self._parse_web_type(text, resp)
text, resp = await concurrent_get_first_result(
tasks,
lambda x: x[0] is not None
and self._parse_web_type(x[0], x[1]) is not None,
)
if text is not None:
return self._parse_web_type(text, resp)
except asyncio.CancelledError:
for t in tasks:
t.cancel()
try:
await t
except asyncio.CancelledError:
pass
@staticmethod
async def _web_ping(
@@ -544,15 +571,27 @@ class MinerFactory:
return MinerTypes.INNOSILICON
async def _get_miner_socket(self, ip: str):
commands = ["version", "devdetails"]
tasks = [asyncio.create_task(self._socket_ping(ip, cmd)) for cmd in commands]
tasks = []
try:
commands = ["version", "devdetails"]
tasks = [
asyncio.create_task(self._socket_ping(ip, cmd)) for cmd in commands
]
data = await concurrent_get_first_result(
tasks, lambda x: x is not None and self._parse_socket_type(x) is not None
)
if data is not None:
d = self._parse_socket_type(data)
return d
data = await concurrent_get_first_result(
tasks,
lambda x: x is not None and self._parse_socket_type(x) is not None,
)
if data is not None:
d = self._parse_socket_type(data)
return d
except asyncio.CancelledError:
for t in tasks:
t.cancel()
try:
await t
except asyncio.CancelledError:
pass
@staticmethod
async def _socket_ping(ip: str, cmd: str) -> Optional[str]:
@@ -890,7 +929,7 @@ class MinerFactory:
return miner_model
except (TypeError, LookupError):
pass
async def get_miner_model_epic(self, ip: str) -> Optional[str]:
sock_json_data = await self.send_web_command(ip, ":4028/capabilities")
try:


@@ -125,6 +125,24 @@ class S19jPro(AntMiner): # noqa - ignore ABC method implementation
self.fan_count = 4
class S19jProPlus(AntMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "S19j Pro+"
self.nominal_chips = 120
self.fan_count = 4
class S19kPro(AntMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "S19k Pro"
self.nominal_chips = 77
self.fan_count = 4
class S19L(AntMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
@@ -132,3 +150,12 @@ class S19L(AntMiner): # noqa - ignore ABC method implementation
self.model = "S19L"
self.nominal_chips = 76
self.fan_count = 4
class S19kProNoPIC(AntMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "S19k Pro No PIC"
self.nominal_chips = 77
self.fan_count = 4


@@ -24,6 +24,9 @@ from .S19 import (
S19j,
S19jNoPIC,
S19jPro,
S19jProPlus,
S19kPro,
S19kProNoPIC,
S19NoPIC,
S19Plus,
S19Pro,


@@ -213,6 +213,7 @@ class M30SPlusVF30(WhatsMiner): # noqa - ignore ABC method implementation
self.nominal_chips = 117
self.fan_count = 2
class M30SPlusVG20(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
@@ -221,6 +222,7 @@ class M30SPlusVG20(WhatsMiner): # noqa - ignore ABC method implementation
self.nominal_chips = 82
self.fan_count = 2
class M30SPlusVG30(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)


@@ -18,6 +18,7 @@ import warnings
from pyasic.miners.makes import WhatsMiner
class M31HV10(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
@@ -26,6 +27,7 @@ class M31HV10(WhatsMiner): # noqa - ignore ABC method implementation
self.nominal_chips = 114
self.fan_count = 0
class M31HV40(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)


@@ -18,6 +18,7 @@ import warnings
from pyasic.miners.makes import WhatsMiner
class M31LV10(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)


@@ -18,6 +18,7 @@ import warnings
from pyasic.miners.makes import WhatsMiner
class M33SPlusVG20(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
@@ -44,7 +45,7 @@ class M33SPlusVH30(WhatsMiner): # noqa - ignore ABC method implementation
self.ip = ip
self.model = "M33S+ VH30"
self.ideal_hashboards = 4
self.nominal_chips = 0 # slot1 116, slot2 106, slot3 116, slot4 106
warnings.warn(
"Unknown chip count for miner type M33S+ VH30, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)


@@ -18,6 +18,7 @@ import warnings
from pyasic.miners.makes import WhatsMiner
class M39V10(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)


@@ -18,6 +18,7 @@ import warnings
from pyasic.miners.makes import WhatsMiner
class M50VE30(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
@@ -27,6 +28,7 @@ class M50VE30(WhatsMiner): # noqa - ignore ABC method implementation
self.nominal_chips = 255
self.fan_count = 2
class M50VG30(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)


@@ -0,0 +1,67 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
import warnings
from pyasic.miners.makes import WhatsMiner
class M60VK10(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M60 VK10"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M60 VK10, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 2
class M60VK20(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M60 VK20"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M60 VK20, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 2
class M60VK30(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M60 VK30"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M60 VK30, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 2
class M60VK40(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M60 VK40"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M60 VK40, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 2


@@ -0,0 +1,65 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
import warnings
from pyasic.miners.makes import WhatsMiner
class M60SVK10(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M60S VK10"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M60S VK10, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 2
class M60SVK20(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M60S VK20"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M60S VK20, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 2
class M60SVK30(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M60S VK30"
self.ideal_hashboards = 3
self.nominal_chips = 78
self.fan_count = 2
class M60SVK40(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M60S VK40"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M60S VK40, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 2


@@ -0,0 +1,53 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
import warnings
from pyasic.miners.makes import WhatsMiner
class M63VK10(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M63 VK10"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M63 VK10, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 0
class M63VK20(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M63 VK20"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M63 VK20, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 0
class M63VK30(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M63 VK30"
self.nominal_chips = 68
self.ideal_hashboards = 4
self.fan_count = 0


@@ -0,0 +1,55 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
import warnings
from pyasic.miners.makes import WhatsMiner
class M63SVK10(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M63S VK10"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M63S VK10, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 0
class M63SVK20(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M63S VK20"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M63S VK20, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 0
class M63SVK30(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M63S VK30"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M63S VK30, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 0


@@ -0,0 +1,43 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
import warnings
from pyasic.miners.makes import WhatsMiner
class M66VK20(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M66 VK20"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M66 VK20, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 0
class M66VK30(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M66 VK30"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M66 VK30, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 0


@@ -0,0 +1,53 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
import warnings
from pyasic.miners.makes import WhatsMiner
class M66SVK20(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M66S VK20"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M66S VK20, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 0
class M66SVK30(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M66S VK30"
self.nominal_chips = 96
self.ideal_hashboards = 4
self.fan_count = 0
class M66SVK40(WhatsMiner): # noqa - ignore ABC method implementation
def __init__(self, ip: str, api_ver: str = "0.0.0"):
super().__init__(ip, api_ver)
self.ip = ip
self.model = "M66S VK40"
self.nominal_chips = 0
warnings.warn(
"Unknown chip count for miner type M66S VK40, please open an issue on GitHub (https://github.com/UpstreamData/pyasic)."
)
self.fan_count = 0


@@ -0,0 +1,22 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from .M60 import M60VK10, M60VK20, M60VK30, M60VK40
from .M60S import M60SVK10, M60SVK20, M60SVK30, M60SVK40
from .M63 import M63VK10, M63VK20, M63VK30
from .M63S import M63SVK10, M63SVK20, M63SVK30
from .M66 import M66VK20, M66VK30
from .M66S import M66SVK20, M66SVK30, M66SVK40


@@ -17,3 +17,4 @@
from .M2X import *
from .M3X import *
from .M5X import *
from .M6X import *


@@ -0,0 +1,34 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from pyasic.miners.backends import M6X
from pyasic.miners.types import M60VK10, M60VK20, M60VK30, M60VK40
class BTMinerM60VK10(M6X, M60VK10):
pass
class BTMinerM60VK20(M6X, M60VK20):
pass
class BTMinerM60VK30(M6X, M60VK30):
pass
class BTMinerM60VK40(M6X, M60VK40):
pass


@@ -0,0 +1,34 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from pyasic.miners.backends import M6X
from pyasic.miners.types import M60SVK10, M60SVK20, M60SVK30, M60SVK40
class BTMinerM60SVK10(M6X, M60SVK10):
pass
class BTMinerM60SVK20(M6X, M60SVK20):
pass
class BTMinerM60SVK30(M6X, M60SVK30):
pass
class BTMinerM60SVK40(M6X, M60SVK40):
pass


@@ -1,5 +1,5 @@
# ------------------------------------------------------------------------------
-# Copyright 2022 Upstream Data Inc -
+# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
@@ -13,11 +13,22 @@
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from enum import Enum
from pyasic.miners.backends import M6X
from pyasic.miners.types import (
M63VK10,
M63VK20,
M63VK30,
)
class SaveAction(Enum):
UNSPECIFIED = "SaveAction.SAVE_ACTION_UNSPECIFIED"
SAVE = "SaveAction.SAVE_ACTION_SAVE"
SAVE_AND_APPLY = "SaveAction.SAVE_ACTION_SAVE_AND_APPLY"
SAVE_AND_FORCE_APPLY = "SaveAction.SAVE_ACTION_SAVE_AND_FORCE_APPLY"
class BTMinerM63VK10(M6X, M63VK10):
pass
class BTMinerM63VK20(M6X, M63VK20):
pass
class BTMinerM63VK30(M6X, M63VK30):
pass


@@ -0,0 +1,34 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from pyasic.miners.backends import M6X
from pyasic.miners.types import (
M63SVK10,
M63SVK20,
M63SVK30,
)
class BTMinerM63SVK10(M6X, M63SVK10):
pass
class BTMinerM63SVK20(M6X, M63SVK20):
pass
class BTMinerM63SVK30(M6X, M63SVK30):
pass


@@ -1,5 +1,5 @@
# ------------------------------------------------------------------------------
-# Copyright 2022 Upstream Data Inc -
+# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
@@ -13,3 +13,17 @@
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from pyasic.miners.backends import M6X
from pyasic.miners.types import (
M66VK20,
M66VK30,
)
class BTMinerM66VK20(M6X, M66VK20):
pass
class BTMinerM66VK30(M6X, M66VK30):
pass


@@ -1,5 +1,5 @@
# ------------------------------------------------------------------------------
-# Copyright 2022 Upstream Data Inc -
+# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
@@ -13,3 +13,18 @@
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from pyasic.miners.backends import M6X
from pyasic.miners.types import M66SVK20, M66SVK30, M66SVK40
class BTMinerM66SVK20(M6X, M66SVK20):
pass
class BTMinerM66SVK30(M6X, M66SVK30):
pass
class BTMinerM66SVK40(M6X, M66SVK40):
pass


@@ -0,0 +1,22 @@
# ------------------------------------------------------------------------------
# Copyright 2023 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from .M60 import BTMinerM60VK10, BTMinerM60VK20, BTMinerM60VK30, BTMinerM60VK40
from .M60S import BTMinerM60SVK10, BTMinerM60SVK20, BTMinerM60SVK30, BTMinerM60SVK40
from .M63 import BTMinerM63VK10, BTMinerM63VK20, BTMinerM63VK30
from .M63S import BTMinerM63SVK10, BTMinerM63SVK20, BTMinerM63SVK30
from .M66 import BTMinerM66VK20, BTMinerM66VK30
from .M66S import BTMinerM66SVK20, BTMinerM66SVK30, BTMinerM66SVK40


@@ -17,3 +17,4 @@
from .M2X import *
from .M3X import *
from .M5X import *
from .M6X import *


@@ -14,46 +14,19 @@
# limitations under the License. -
# ------------------------------------------------------------------------------
 import json
-from datetime import datetime, timedelta
-from typing import List, Union
+from datetime import timedelta
+from typing import Union
 
-import grpc_requests
 import httpx
-from google.protobuf.message import Message
-from grpc import RpcError
+from betterproto import Message
+from grpclib.client import Channel
 
-from pyasic import APIError, settings
+from pyasic import settings
+from pyasic.errors import APIError
 from pyasic.web import BaseWebAPI
-from pyasic.web.bosminer.proto import (
-    get_auth_service_descriptors,
-    get_service_descriptors,
-)
-from pyasic.web.bosminer.proto.bos.v1.actions_pb2 import (  # noqa: this will be defined
-    SetLocateDeviceStatusRequest,
-)
-from pyasic.web.bosminer.proto.bos.v1.authentication_pb2 import (  # noqa: this will be defined
-    SetPasswordRequest,
-)
-from pyasic.web.bosminer.proto.bos.v1.common_pb2 import (  # noqa: this will be defined
-    SaveAction,
-)
-from pyasic.web.bosminer.proto.bos.v1.cooling_pb2 import (  # noqa: this will be defined
-    SetImmersionModeRequest,
-)
-from pyasic.web.bosminer.proto.bos.v1.miner_pb2 import (  # noqa: this will be defined
-    DisableHashboardsRequest,
-    EnableHashboardsRequest,
-)
-from pyasic.web.bosminer.proto.bos.v1.performance_pb2 import (  # noqa: this will be defined
-    DecrementHashrateTargetRequest,
-    DecrementPowerTargetRequest,
-    IncrementHashrateTargetRequest,
-    IncrementPowerTargetRequest,
-    SetDefaultHashrateTargetRequest,
-    SetDefaultPowerTargetRequest,
-    SetHashrateTargetRequest,
-    SetPowerTargetRequest,
-)
+from .proto.braiins.bos import *
+from .proto.braiins.bos.v1 import *
class BOSMinerWebAPI(BaseWebAPI):
@@ -286,6 +259,20 @@ class BOSMinerLuCIAPI:
return await self.send_command("/cgi-bin/luci/admin/miner/api_status")
class BOSMinerGRPCStub(
ApiVersionServiceStub,
AuthenticationServiceStub,
CoolingServiceStub,
ConfigurationServiceStub,
MinerServiceStub,
PoolServiceStub,
LicenseServiceStub,
ActionsServiceStub,
PerformanceServiceStub,
):
pass
class BOSMinerGRPCAPI:
def __init__(self, ip: str, pwd: str):
self.ip = ip
@@ -321,25 +308,16 @@ class BOSMinerGRPCAPI:
ignore_errors: bool = False,
auth: bool = True,
) -> dict:
-        service, method = command.split("/")
-
         metadata = []
         if auth:
             metadata.append(("authorization", await self.auth()))
-        async with grpc_requests.StubAsyncClient(
-            f"{self.ip}:50051", service_descriptors=get_service_descriptors()
-        ) as client:
-            await client.register_all_service()
-            try:
-                return await client.request(
-                    service,
-                    method,
-                    request=message,
-                    metadata=metadata,
-                )
-            except RpcError as e:
-                if ignore_errors:
-                    return {}
-                raise APIError(e._details)
+        async with Channel(self.ip, 50051) as c:
+            endpoint = getattr(BOSMinerGRPCStub(c), command, None)
+            if endpoint is None:
+                if not ignore_errors:
+                    raise APIError(f"Command not found - {command}")
+                return {}
+            return (await endpoint(message, metadata=metadata)).to_pydict()
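The rewritten `send_command` replaces string `Service/Method` routing with an attribute lookup on a stub class that multiply inherits every generated service stub. A toy sketch of that dispatch style, with hypothetical stub classes standing in for the grpclib-generated ones:

```python
import asyncio


class ActionsStub:
    async def pause_mining(self, message=None, metadata=None):
        return {"paused": True}


class CoolingStub:
    async def get_cooling_state(self, message=None, metadata=None):
        return {"fans": 2}


class CombinedStub(ActionsStub, CoolingStub):
    # One object exposes every service's RPCs, so a caller can address
    # any of them by snake_case name instead of a "Service/Method" path.
    pass


async def send_command(command: str, message=None, ignore_errors: bool = False):
    # Resolve the RPC by name; a default of None keeps unknown commands
    # from raising AttributeError before we can handle them.
    endpoint = getattr(CombinedStub(), command, None)
    if endpoint is None:
        if not ignore_errors:
            raise KeyError(f"Command not found - {command}")
        return {}
    return await endpoint(message)


print(asyncio.run(send_command("get_cooling_state")))  # {'fans': 2}
```

One design consequence worth noting: every mixed-in stub must use unique method names, since the MRO silently picks the first match when two services define the same one.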
async def auth(self):
        if self._auth is not None and datetime.now() - self._auth_time < timedelta(
@@ -350,100 +328,85 @@ class BOSMinerGRPCAPI:
return self._auth
async def _get_auth(self):
-        async with grpc_requests.StubAsyncClient(
-            f"{self.ip}:50051", service_descriptors=get_auth_service_descriptors()
-        ) as client:
-            await client.register_all_service()
-            method_meta = client.get_method_meta(
-                "braiins.bos.v1.AuthenticationService", "Login"
-            )
-            _request = method_meta.method_type.request_parser(
-                {"username": self.username, "password": self.pwd},
-                method_meta.input_type,
-            )
-            metadata = await method_meta.handler(_request).initial_metadata()
-            for key, value in metadata:
-                if key == "authorization":
-                    self._auth = value
+        async with Channel(self.ip, 50051) as c:
+            req = LoginRequest(username=self.username, password=self.pwd)
+            async with c.request(
+                "/braiins.bos.v1.AuthenticationService/Login",
+                grpclib.const.Cardinality.UNARY_UNARY,
+                type(req),
+                LoginResponse,
+            ) as stream:
+                await stream.send_message(req, end=True)
+                await stream.recv_initial_metadata()
+                auth = stream.initial_metadata.get("authorization")
+                if auth is not None:
+                    self._auth = auth
 
         self._auth_time = datetime.now()
         return self._auth
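`auth()` only re-runs this login when the cached token has aged past its lifetime. That caching logic in isolation, with illustrative names rather than the pyasic API:

```python
from datetime import datetime, timedelta


class TokenCache:
    """Cache a login token and refresh it only after `ttl` has elapsed."""

    def __init__(self, login, ttl: timedelta = timedelta(seconds=3540)):
        self._login = login  # callable that performs the real login
        self._ttl = ttl
        self._token = None
        self._fetched_at = None

    def get(self):
        if (
            self._token is not None
            and datetime.now() - self._fetched_at < self._ttl
        ):
            return self._token  # still fresh, skip the network round trip
        self._token = self._login()
        self._fetched_at = datetime.now()
        return self._token


logins = []
cache = TokenCache(lambda: logins.append("hit") or f"token-{len(logins)}")
print(cache.get(), cache.get())  # token-1 token-1
print(len(logins))  # 1
```

Note the freshness test is `now - fetched_at < ttl`; writing the subtraction the other way round yields a negative timedelta that is always below the threshold, so the token would never refresh.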
     async def get_api_version(self):
         return await self.send_command(
-            "braiins.bos.ApiVersionService/GetApiVersion", auth=False
+            "get_api_version", ApiVersionRequest(), auth=False
         )
 
     async def start(self):
-        return await self.send_command("braiins.bos.v1.ActionsService/Start")
+        return await self.send_command("start", StartRequest())
 
     async def stop(self):
-        return await self.send_command("braiins.bos.v1.ActionsService/Stop")
+        return await self.send_command("stop", StopRequest())
 
     async def pause_mining(self):
-        return await self.send_command("braiins.bos.v1.ActionsService/PauseMining")
+        return await self.send_command("pause_mining", PauseMiningRequest())
 
     async def resume_mining(self):
-        return await self.send_command("braiins.bos.v1.ActionsService/ResumeMining")
+        return await self.send_command("resume_mining", ResumeMiningRequest())
 
     async def restart(self):
-        return await self.send_command("braiins.bos.v1.ActionsService/Restart")
+        return await self.send_command("restart", RestartRequest())
 
     async def reboot(self):
-        return await self.send_command("braiins.bos.v1.ActionsService/Reboot")
+        return await self.send_command("reboot", RebootRequest())
 
     async def set_locate_device_status(self, enable: bool):
-        message = SetLocateDeviceStatusRequest()
-        message.enable = enable
         return await self.send_command(
-            "braiins.bos.v1.ActionsService/SetLocateDeviceStatus", message=message
+            "set_locate_device_status", SetLocateDeviceStatusRequest(enable=enable)
         )
 
     async def get_locate_device_status(self):
-        return await self.send_command(
-            "braiins.bos.v1.ActionsService/GetLocateDeviceStatus"
-        )
+        return await self.send_command("get_locate_device_status")
 
     async def set_password(self, password: str = None):
-        message = SetPasswordRequest()
-        if password:
-            message.password = password
         return await self.send_command(
-            "braiins.bos.v1.AuthenticationService/SetPassword", message=message
+            "set_password", SetPasswordRequest(password=password)
         )
 
     async def get_cooling_state(self):
-        return await self.send_command("braiins.bos.v1.CoolingService/GetCoolingState")
+        return await self.send_command("get_cooling_state", GetCoolingStateRequest())
 
     async def set_immersion_mode(
         self,
         enable: bool,
         save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
     ):
-        message = SetImmersionModeRequest()
-        message.enable = enable
-        message.save_action = save_action
         return await self.send_command(
-            "braiins.bos.v1.CoolingService/SetImmersionMode", message=message
+            "set_immersion_mode",
+            SetImmersionModeRequest(
+                enable_immersion_mode=enable, save_action=save_action
+            ),
         )
 
     async def get_tuner_state(self):
-        return await self.send_command(
-            "braiins.bos.v1.PerformanceService/GetTunerState"
-        )
+        return await self.send_command("get_tuner_state")
 
     async def list_target_profiles(self):
-        return await self.send_command(
-            "braiins.bos.v1.PerformanceService/ListTargetProfiles"
-        )
+        return await self.send_command("list_target_profiles")
 
     async def set_default_power_target(
         self, save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY
     ):
-        message = SetDefaultPowerTargetRequest()
-        message.save_action = save_action
         return await self.send_command(
-            "braiins.bos.v1.PerformanceService/SetDefaultPowerTarget", message=message
+            "set_default_power_target",
+            message=SetDefaultPowerTargetRequest(save_action=save_action),
         )
 
     async def set_power_target(
@@ -451,11 +414,11 @@ class BOSMinerGRPCAPI:
         power_target: int,
         save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
     ):
-        message = SetPowerTargetRequest()
-        message.power_target.watt = power_target
-        message.save_action = save_action
         return await self.send_command(
-            "braiins.bos.v1.PerformanceService/SetPowerTarget", message=message
+            "set_power_target",
+            SetPowerTargetRequest(
+                power_target=Power(watt=power_target), save_action=save_action
+            ),
         )
 
     async def increment_power_target(
@@ -463,12 +426,12 @@ class BOSMinerGRPCAPI:
         power_target_increment: int,
         save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
     ):
-        message = IncrementPowerTargetRequest()
-        message.power_target_increment.watt = power_target_increment
-        message.save_action = save_action
         return await self.send_command(
-            "braiins.bos.v1.PerformanceService/IncrementPowerTarget", message=message
+            "increment_power_target",
+            message=IncrementPowerTargetRequest(
+                power_target_increment=Power(watt=power_target_increment),
+                save_action=save_action,
+            ),
         )
 
     async def decrement_power_target(
@@ -476,37 +439,33 @@ class BOSMinerGRPCAPI:
power_target_decrement: int,
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
):
message = DecrementPowerTargetRequest()
message.power_target_decrement.watt = power_target_decrement
message.save_action = save_action
return await self.send_command(
"braiins.bos.v1.PerformanceService/DecrementPowerTarget",
message=message,
"decrement_power_target",
message=DecrementPowerTargetRequest(
power_target_decrement=Power(watt=power_target_decrement),
save_action=save_action,
),
)
async def set_default_hashrate_target(
self, save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY
):
message = SetDefaultHashrateTargetRequest()
message.save_action = save_action
return await self.send_command(
"braiins.bos.v1.PerformanceService/SetDefaultHashrateTarget",
message=message,
"set_default_hashrate_target",
message=SetDefaultHashrateTargetRequest(save_action=save_action),
)
async def set_hashrate_target(
self,
hashrate_target: int,
hashrate_target: float,
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
):
message = SetHashrateTargetRequest()
message.hashrate_target.terahash_per_second = hashrate_target
message.save_action = save_action
return await self.send_command(
"braiins.bos.v1.PerformanceService/SetHashrateTarget", message=message
"set_hashrate_target",
SetHashrateTargetRequest(
hashrate_target=TeraHashrate(terahash_per_second=hashrate_target),
save_action=save_action,
),
)
async def increment_hashrate_target(
@@ -514,15 +473,14 @@ class BOSMinerGRPCAPI:
hashrate_target_increment: int,
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
):
message = IncrementHashrateTargetRequest()
message.hashrate_target_increment.terahash_per_second = (
hashrate_target_increment
)
message.save_action = save_action
return await self.send_command(
"braiins.bos.v1.PerformanceService/IncrementHashrateTarget",
message=message,
"increment_hashrate_target",
IncrementHashrateTargetRequest(
hashrate_target_increment=TeraHashrate(
terahash_per_second=hashrate_target_increment
),
save_action=save_action,
),
)
async def decrement_hashrate_target(
@@ -530,18 +488,19 @@ class BOSMinerGRPCAPI:
hashrate_target_decrement: int,
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
):
message = DecrementHashrateTargetRequest()
message.hashrate_target_decrement.terahash_per_second = (
hashrate_target_decrement
)
message.save_action = save_action
return await self.send_command(
"braiins.bos.v1.PerformanceService/DecrementHashrateTarget",
message=message,
"decrement_hashrate_target",
DecrementHashrateTargetRequest(
hashrate_target_decrement=TeraHashrate(
terahash_per_second=hashrate_target_decrement
),
save_action=save_action,
),
)
async def set_dps(self):
async def set_dps(
self,
):
raise NotImplementedError
return await self.send_command("braiins.bos.v1.PerformanceService/SetDPS")
@@ -553,11 +512,11 @@ class BOSMinerGRPCAPI:
async def get_active_performance_mode(self):
return await self.send_command(
"braiins.bos.v1.PerformanceService/GetActivePerformanceMode"
"get_active_performance_mode", GetPerformanceModeRequest()
)
async def get_pool_groups(self):
return await self.send_command("braiins.bos.v1.PoolService/GetPoolGroups")
return await self.send_command("get_pool_groups", GetPoolGroupsRequest())
async def create_pool_group(self):
raise NotImplementedError
@@ -573,43 +532,42 @@ class BOSMinerGRPCAPI:
async def get_miner_configuration(self):
return await self.send_command(
"braiins.bos.v1.ConfigurationService/GetMinerConfiguration"
"get_miner_configuration", GetMinerConfigurationRequest()
)
async def get_constraints(self):
return await self.send_command(
"braiins.bos.v1.ConfigurationService/GetConstraints"
)
return await self.send_command("get_constraints", GetConstraintsRequest())
async def get_license_state(self):
return await self.send_command("braiins.bos.v1.LicenseService/GetLicenseState")
return await self.send_command("get_license_state", GetLicenseStateRequest())
async def get_miner_status(self):
return await self.send_command("braiins.bos.v1.MinerService/GetMinerStatus")
return await self.send_command("get_miner_status", GetMinerStatusRequest())
async def get_miner_details(self):
return await self.send_command("braiins.bos.v1.MinerService/GetMinerDetails")
return await self.send_command("get_miner_details", GetMinerDetailsRequest())
async def get_miner_stats(self):
return await self.send_command("braiins.bos.v1.MinerService/GetMinerStats")
return await self.send_command("get_miner_stats", GetMinerStatsRequest())
async def get_hashboards(self):
return await self.send_command("braiins.bos.v1.MinerService/GetHashboards")
return await self.send_command("get_hashboards", GetHashboardsRequest())
async def get_support_archive(self):
return await self.send_command("braiins.bos.v1.MinerService/GetSupportArchive")
return await self.send_command(
"get_support_archive", GetSupportArchiveRequest()
)
async def enable_hashboards(
self,
hashboard_ids: List[str],
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
):
message = EnableHashboardsRequest()
message.hashboard_ids[:] = hashboard_ids
message.save_action = save_action
return await self.send_command(
"braiins.bos.v1.MinerService/EnableHashboards", message=message
"enable_hashboards",
EnableHashboardsRequest(
hashboard_ids=hashboard_ids, save_action=save_action
),
)
async def disable_hashboards(
@@ -617,10 +575,9 @@ class BOSMinerGRPCAPI:
hashboard_ids: List[str],
save_action: SaveAction = SaveAction.SAVE_ACTION_SAVE_AND_APPLY,
):
message = DisableHashboardsRequest()
message.hashboard_ids[:] = hashboard_ids
message.save_action = save_action
return await self.send_command(
"braiins.bos.v1.MinerService/DisableHashboards", message=message
"disable_hashboards",
DisableHashboardsRequest(
hashboard_ids=hashboard_ids, save_action=save_action
),
)
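The hunks above migrate every call from a full gRPC method path (e.g. `"braiins.bos.v1.MinerService/DisableHashboards"`) to a short snake_case command name plus an explicit request message. A minimal sketch, assuming the wrapper resolves command names back to gRPC method names by case conversion (hypothetical helper, not pyasic's actual `send_command` implementation):

```python
# Hypothetical sketch: one way a send_command wrapper could map the short
# snake_case command names used in the new API ("get_miner_status") back to
# the CamelCase gRPC method names ("GetMinerStatus").
def command_to_method(command: str) -> str:
    """Convert a snake_case command name to a CamelCase gRPC method name."""
    return "".join(part.capitalize() for part in command.split("_"))


# "get_miner_status" -> "GetMinerStatus"
# "set_power_target" -> "SetPowerTarget"
```

A real resolver would also need to handle acronyms (e.g. `set_dps` -> `SetDPS`), which simple capitalization does not cover.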


@@ -1,54 +0,0 @@
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from __future__ import annotations
from .bos import version_pb2
from .bos.v1 import (
actions_pb2,
authentication_pb2,
common_pb2,
configuration_pb2,
constraints_pb2,
cooling_pb2,
license_pb2,
miner_pb2,
performance_pb2,
pool_pb2,
units_pb2,
work_pb2,
)
def get_service_descriptors():
return [
*version_pb2.DESCRIPTOR.services_by_name.values(),
*authentication_pb2.DESCRIPTOR.services_by_name.values(),
*actions_pb2.DESCRIPTOR.services_by_name.values(),
*common_pb2.DESCRIPTOR.services_by_name.values(),
*configuration_pb2.DESCRIPTOR.services_by_name.values(),
*constraints_pb2.DESCRIPTOR.services_by_name.values(),
*cooling_pb2.DESCRIPTOR.services_by_name.values(),
*license_pb2.DESCRIPTOR.services_by_name.values(),
*miner_pb2.DESCRIPTOR.services_by_name.values(),
*performance_pb2.DESCRIPTOR.services_by_name.values(),
*pool_pb2.DESCRIPTOR.services_by_name.values(),
*units_pb2.DESCRIPTOR.services_by_name.values(),
*work_pb2.DESCRIPTOR.services_by_name.values(),
]
def get_auth_service_descriptors():
return authentication_pb2.DESCRIPTOR.services_by_name.values()
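The deleted module above collects `ServiceDescriptor` objects from every generated `*_pb2` module. A minimal sketch of how such descriptors could be flattened into a method-name lookup table; a tiny namedtuple stands in for the real protobuf descriptors (which expose `full_name` and `methods`) so the logic runs without the generated modules:

```python
# Sketch only: `Service`/`Method` are stand-ins for protobuf's
# ServiceDescriptor/MethodDescriptor, which expose the same attributes.
from collections import namedtuple

Method = namedtuple("Method", ["name"])
Service = namedtuple("Service", ["full_name", "methods"])


def build_method_table(services):
    """Map each method name to its full gRPC path, e.g.
    'GetMinerStatus' -> '/braiins.bos.v1.MinerService/GetMinerStatus'."""
    table = {}
    for svc in services:
        for method in svc.methods:
            table[method.name] = f"/{svc.full_name}/{method.name}"
    return table


services = [
    Service("braiins.bos.v1.MinerService", [Method("GetMinerStatus")]),
]
```

With real descriptors, `get_service_descriptors()` would feed `build_method_table` directly.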


@@ -1,56 +0,0 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/actions.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x14\x62os/v1/actions.proto\x12\x0e\x62raiins.bos.v1"\x0e\n\x0cStartRequest"(\n\rStartResponse\x12\x17\n\x0f\x61lready_running\x18\x01 \x01(\x08"\x10\n\x0eRestartRequest"*\n\x0fRestartResponse\x12\x17\n\x0f\x61lready_running\x18\x01 \x01(\x08"\x0f\n\rRebootRequest"\x10\n\x0eRebootResponse"\r\n\x0bStopRequest"\'\n\x0cStopResponse\x12\x17\n\x0f\x61lready_stopped\x18\x01 \x01(\x08"\x14\n\x12PauseMiningRequest"-\n\x13PauseMiningResponse\x12\x16\n\x0e\x61lready_paused\x18\x01 \x01(\x08"\x15\n\x13ResumeMiningRequest".\n\x14ResumeMiningResponse\x12\x16\n\x0e\x61lready_mining\x18\x01 \x01(\x08".\n\x1cSetLocateDeviceStatusRequest\x12\x0e\n\x06\x65nable\x18\x01 \x01(\x08"-\n\x1aLocateDeviceStatusResponse\x12\x0f\n\x07\x65nabled\x18\x01 \x01(\x08"\x1e\n\x1cGetLocateDeviceStatusRequest2\xc7\x05\n\x0e\x41\x63tionsService\x12\x44\n\x05Start\x12\x1c.braiins.bos.v1.StartRequest\x1a\x1d.braiins.bos.v1.StartResponse\x12\x41\n\x04Stop\x12\x1b.braiins.bos.v1.StopRequest\x1a\x1c.braiins.bos.v1.StopResponse\x12V\n\x0bPauseMining\x12".braiins.bos.v1.PauseMiningRequest\x1a#.braiins.bos.v1.PauseMiningResponse\x12Y\n\x0cResumeMining\x12#.braiins.bos.v1.ResumeMiningRequest\x1a$.braiins.bos.v1.ResumeMiningResponse\x12J\n\x07Restart\x12\x1e.braiins.bos.v1.RestartRequest\x1a\x1f.braiins.bos.v1.RestartResponse\x12G\n\x06Reboot\x12\x1d.braiins.bos.v1.RebootRequest\x1a\x1e.braiins.bos.v1.RebootResponse\x12q\n\x15SetLocateDeviceStatus\x12,.braiins.bos.v1.SetLocateDeviceStatusRequest\x1a*.braiins.bos.v1.LocateDeviceStatusResponse\x12q\n\x15GetLocateDeviceStatus\x12,.braiins.bos.v1.GetLocateDeviceStatusRequest\x1a*.braiins.bos.v1.LocateDeviceStatusResponseb\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.actions_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
DESCRIPTOR._options = None
_globals["_STARTREQUEST"]._serialized_start = 40
_globals["_STARTREQUEST"]._serialized_end = 54
_globals["_STARTRESPONSE"]._serialized_start = 56
_globals["_STARTRESPONSE"]._serialized_end = 96
_globals["_RESTARTREQUEST"]._serialized_start = 98
_globals["_RESTARTREQUEST"]._serialized_end = 114
_globals["_RESTARTRESPONSE"]._serialized_start = 116
_globals["_RESTARTRESPONSE"]._serialized_end = 158
_globals["_REBOOTREQUEST"]._serialized_start = 160
_globals["_REBOOTREQUEST"]._serialized_end = 175
_globals["_REBOOTRESPONSE"]._serialized_start = 177
_globals["_REBOOTRESPONSE"]._serialized_end = 193
_globals["_STOPREQUEST"]._serialized_start = 195
_globals["_STOPREQUEST"]._serialized_end = 208
_globals["_STOPRESPONSE"]._serialized_start = 210
_globals["_STOPRESPONSE"]._serialized_end = 249
_globals["_PAUSEMININGREQUEST"]._serialized_start = 251
_globals["_PAUSEMININGREQUEST"]._serialized_end = 271
_globals["_PAUSEMININGRESPONSE"]._serialized_start = 273
_globals["_PAUSEMININGRESPONSE"]._serialized_end = 318
_globals["_RESUMEMININGREQUEST"]._serialized_start = 320
_globals["_RESUMEMININGREQUEST"]._serialized_end = 341
_globals["_RESUMEMININGRESPONSE"]._serialized_start = 343
_globals["_RESUMEMININGRESPONSE"]._serialized_end = 389
_globals["_SETLOCATEDEVICESTATUSREQUEST"]._serialized_start = 391
_globals["_SETLOCATEDEVICESTATUSREQUEST"]._serialized_end = 437
_globals["_LOCATEDEVICESTATUSRESPONSE"]._serialized_start = 439
_globals["_LOCATEDEVICESTATUSRESPONSE"]._serialized_end = 484
_globals["_GETLOCATEDEVICESTATUSREQUEST"]._serialized_start = 486
_globals["_GETLOCATEDEVICESTATUSREQUEST"]._serialized_end = 516
_globals["_ACTIONSSERVICE"]._serialized_start = 519
_globals["_ACTIONSSERVICE"]._serialized_end = 1230
# @@protoc_insertion_point(module_scope)


@@ -1,36 +0,0 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/authentication.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x1b\x62os/v1/authentication.proto\x12\x0e\x62raiins.bos.v1"2\n\x0cLoginRequest\x12\x10\n\x08username\x18\x01 \x01(\t\x12\x10\n\x08password\x18\x02 \x01(\t"\x0f\n\rLoginResponse"8\n\x12SetPasswordRequest\x12\x15\n\x08password\x18\x01 \x01(\tH\x00\x88\x01\x01\x42\x0b\n\t_password"\x15\n\x13SetPasswordResponse2\xb5\x01\n\x15\x41uthenticationService\x12\x44\n\x05Login\x12\x1c.braiins.bos.v1.LoginRequest\x1a\x1d.braiins.bos.v1.LoginResponse\x12V\n\x0bSetPassword\x12".braiins.bos.v1.SetPasswordRequest\x1a#.braiins.bos.v1.SetPasswordResponseb\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(
DESCRIPTOR, "bos.v1.authentication_pb2", _globals
)
if _descriptor._USE_C_DESCRIPTORS == False:
DESCRIPTOR._options = None
_globals["_LOGINREQUEST"]._serialized_start = 47
_globals["_LOGINREQUEST"]._serialized_end = 97
_globals["_LOGINRESPONSE"]._serialized_start = 99
_globals["_LOGINRESPONSE"]._serialized_end = 114
_globals["_SETPASSWORDREQUEST"]._serialized_start = 116
_globals["_SETPASSWORDREQUEST"]._serialized_end = 172
_globals["_SETPASSWORDRESPONSE"]._serialized_start = 174
_globals["_SETPASSWORDRESPONSE"]._serialized_end = 195
_globals["_AUTHENTICATIONSERVICE"]._serialized_start = 198
_globals["_AUTHENTICATIONSERVICE"]._serialized_end = 379
# @@protoc_insertion_point(module_scope)


@@ -1,26 +0,0 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/common.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b"\n\x13\x62os/v1/common.proto\x12\x0e\x62raiins.bos.v1*\x85\x01\n\nSaveAction\x12\x1b\n\x17SAVE_ACTION_UNSPECIFIED\x10\x00\x12\x14\n\x10SAVE_ACTION_SAVE\x10\x01\x12\x1e\n\x1aSAVE_ACTION_SAVE_AND_APPLY\x10\x02\x12$\n SAVE_ACTION_SAVE_AND_FORCE_APPLY\x10\x03\x62\x06proto3"
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.common_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
DESCRIPTOR._options = None
_globals["_SAVEACTION"]._serialized_start = 40
_globals["_SAVEACTION"]._serialized_end = 173
# @@protoc_insertion_point(module_scope)
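The serialized `common.proto` descriptor above defines the `SaveAction` enum used as the default argument throughout the API diff. Reading the enum values out of the serialized bytes gives the following plain-Python mirror (a sketch for reference; the generated `common_pb2` module is the authoritative source):

```python
# Mirror of the SaveAction enum encoded in bos/v1/common.proto above;
# values taken from the serialized descriptor bytes.
from enum import IntEnum


class SaveAction(IntEnum):
    SAVE_ACTION_UNSPECIFIED = 0
    SAVE_ACTION_SAVE = 1
    SAVE_ACTION_SAVE_AND_APPLY = 2
    SAVE_ACTION_SAVE_AND_FORCE_APPLY = 3
```

This matches the API methods' default of `SaveAction.SAVE_ACTION_SAVE_AND_APPLY`, i.e. persist the setting and apply it immediately.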


@@ -1,40 +0,0 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/configuration.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from ...bos.v1 import cooling_pb2 as bos_dot_v1_dot_cooling__pb2
from ...bos.v1 import performance_pb2 as bos_dot_v1_dot_performance__pb2
from ...bos.v1 import pool_pb2 as bos_dot_v1_dot_pool__pb2
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x1a\x62os/v1/configuration.proto\x12\x0e\x62raiins.bos.v1\x1a\x14\x62os/v1/cooling.proto\x1a\x18\x62os/v1/performance.proto\x1a\x11\x62os/v1/pool.proto"\x1e\n\x1cGetMinerConfigurationRequest"\xc6\x02\n\x1dGetMinerConfigurationResponse\x12;\n\x0bpool_groups\x18\x01 \x03(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration\x12\x39\n\x0btemperature\x18\x02 \x01(\x0b\x32$.braiins.bos.v1.CoolingConfiguration\x12\x31\n\x05tuner\x18\x03 \x01(\x0b\x32".braiins.bos.v1.TunerConfiguration\x12-\n\x03\x64ps\x18\x04 \x01(\x0b\x32 .braiins.bos.v1.DPSConfiguration\x12K\n\x10hashboard_config\x18\x05 \x01(\x0b\x32\x31.braiins.bos.v1.HashboardPerformanceConfiguration"\x17\n\x15GetConstraintsRequest"\x95\x02\n\x16GetConstraintsResponse\x12;\n\x11tuner_constraints\x18\x01 \x01(\x0b\x32 .braiins.bos.v1.TunerConstraints\x12?\n\x13\x63ooling_constraints\x18\x02 \x01(\x0b\x32".braiins.bos.v1.CoolingConstraints\x12\x37\n\x0f\x64ps_constraints\x18\x03 \x01(\x0b\x32\x1e.braiins.bos.v1.DPSConstraints\x12\x44\n\x16hashboards_constraints\x18\x04 \x01(\x0b\x32$.braiins.bos.v1.HashboardConstraints2\xed\x01\n\x14\x43onfigurationService\x12t\n\x15GetMinerConfiguration\x12,.braiins.bos.v1.GetMinerConfigurationRequest\x1a-.braiins.bos.v1.GetMinerConfigurationResponse\x12_\n\x0eGetConstraints\x12%.braiins.bos.v1.GetConstraintsRequest\x1a&.braiins.bos.v1.GetConstraintsResponseb\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(
DESCRIPTOR, "bos.v1.configuration_pb2", _globals
)
if _descriptor._USE_C_DESCRIPTORS == False:
DESCRIPTOR._options = None
_globals["_GETMINERCONFIGURATIONREQUEST"]._serialized_start = 113
_globals["_GETMINERCONFIGURATIONREQUEST"]._serialized_end = 143
_globals["_GETMINERCONFIGURATIONRESPONSE"]._serialized_start = 146
_globals["_GETMINERCONFIGURATIONRESPONSE"]._serialized_end = 472
_globals["_GETCONSTRAINTSREQUEST"]._serialized_start = 474
_globals["_GETCONSTRAINTSREQUEST"]._serialized_end = 497
_globals["_GETCONSTRAINTSRESPONSE"]._serialized_start = 500
_globals["_GETCONSTRAINTSRESPONSE"]._serialized_end = 777
_globals["_CONFIGURATIONSERVICE"]._serialized_start = 780
_globals["_CONFIGURATIONSERVICE"]._serialized_end = 1017
# @@protoc_insertion_point(module_scope)


@@ -1,44 +0,0 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/constraints.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from ...bos.v1 import units_pb2 as bos_dot_v1_dot_units__pb2
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x18\x62os/v1/constraints.proto\x12\x0e\x62raiins.bos.v1\x1a\x12\x62os/v1/units.proto">\n\x11UInt32Constraints\x12\x0f\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\r\x12\x0b\n\x03min\x18\x02 \x01(\r\x12\x0b\n\x03max\x18\x03 \x01(\r">\n\x11\x44oubleConstraints\x12\x0f\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x01\x12\x0b\n\x03min\x18\x02 \x01(\x01\x12\x0b\n\x03max\x18\x03 \x01(\x01"\x82\x01\n\x10PowerConstraints\x12&\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x15.braiins.bos.v1.Power\x12"\n\x03min\x18\x02 \x01(\x0b\x32\x15.braiins.bos.v1.Power\x12"\n\x03max\x18\x03 \x01(\x0b\x32\x15.braiins.bos.v1.Power"\x9a\x01\n\x13HashrateConstraints\x12-\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x1c.braiins.bos.v1.TeraHashrate\x12)\n\x03min\x18\x02 \x01(\x0b\x32\x1c.braiins.bos.v1.TeraHashrate\x12)\n\x03max\x18\x03 \x01(\x0b\x32\x1c.braiins.bos.v1.TeraHashrate"\x9a\x01\n\x16TemperatureConstraints\x12,\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12(\n\x03min\x18\x02 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12(\n\x03max\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature"$\n\x11\x42ooleanConstraint\x12\x0f\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x08"\x85\x01\n\x13\x44urationConstraints\x12&\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x15.braiins.bos.v1.Hours\x12"\n\x03min\x18\x02 \x01(\x0b\x32\x15.braiins.bos.v1.Hours\x12"\n\x03max\x18\x03 \x01(\x0b\x32\x15.braiins.bos.v1.Hours"\x92\x01\n\x14\x46requencyConstraints\x12*\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x19.braiins.bos.v1.Frequency\x12&\n\x03min\x18\x02 \x01(\x0b\x32\x19.braiins.bos.v1.Frequency\x12&\n\x03max\x18\x03 \x01(\x0b\x32\x19.braiins.bos.v1.Frequency"\x8a\x01\n\x12VoltageConstraints\x12(\n\x07\x64\x65\x66\x61ult\x18\x01 \x01(\x0b\x32\x17.braiins.bos.v1.Voltage\x12$\n\x03min\x18\x02 \x01(\x0b\x32\x17.braiins.bos.v1.Voltage\x12$\n\x03max\x18\x03 \x01(\x0b\x32\x17.braiins.bos.v1.Voltageb\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.constraints_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
DESCRIPTOR._options = None
_globals["_UINT32CONSTRAINTS"]._serialized_start = 64
_globals["_UINT32CONSTRAINTS"]._serialized_end = 126
_globals["_DOUBLECONSTRAINTS"]._serialized_start = 128
_globals["_DOUBLECONSTRAINTS"]._serialized_end = 190
_globals["_POWERCONSTRAINTS"]._serialized_start = 193
_globals["_POWERCONSTRAINTS"]._serialized_end = 323
_globals["_HASHRATECONSTRAINTS"]._serialized_start = 326
_globals["_HASHRATECONSTRAINTS"]._serialized_end = 480
_globals["_TEMPERATURECONSTRAINTS"]._serialized_start = 483
_globals["_TEMPERATURECONSTRAINTS"]._serialized_end = 637
_globals["_BOOLEANCONSTRAINT"]._serialized_start = 639
_globals["_BOOLEANCONSTRAINT"]._serialized_end = 675
_globals["_DURATIONCONSTRAINTS"]._serialized_start = 678
_globals["_DURATIONCONSTRAINTS"]._serialized_end = 811
_globals["_FREQUENCYCONSTRAINTS"]._serialized_start = 814
_globals["_FREQUENCYCONSTRAINTS"]._serialized_end = 960
_globals["_VOLTAGECONSTRAINTS"]._serialized_start = 963
_globals["_VOLTAGECONSTRAINTS"]._serialized_end = 1101
# @@protoc_insertion_point(module_scope)


@@ -1,56 +0,0 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/cooling.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from ...bos.v1 import common_pb2 as bos_dot_v1_dot_common__pb2
from ...bos.v1 import constraints_pb2 as bos_dot_v1_dot_constraints__pb2
from ...bos.v1 import units_pb2 as bos_dot_v1_dot_units__pb2
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x14\x62os/v1/cooling.proto\x12\x0e\x62raiins.bos.v1\x1a\x13\x62os/v1/common.proto\x1a\x18\x62os/v1/constraints.proto\x1a\x12\x62os/v1/units.proto"\xbc\x01\n\x0f\x43oolingAutoMode\x12\x37\n\x12target_temperature\x18\x01 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12\x34\n\x0fhot_temperature\x18\x02 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12:\n\x15\x64\x61ngerous_temperature\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature"\xb7\x01\n\x11\x43oolingManualMode\x12\x1c\n\x0f\x66\x61n_speed_ratio\x18\x01 \x01(\x01H\x00\x88\x01\x01\x12\x34\n\x0fhot_temperature\x18\x02 \x01(\x0b\x32\x1b.braiins.bos.v1.Temperature\x12:\n\x15\x64\x61ngerous_temperature\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.TemperatureB\x12\n\x10_fan_speed_ratio"G\n\x13\x43oolingDisabledMode\x12\x1c\n\x0f\x66\x61n_speed_ratio\x18\x01 \x01(\x01H\x00\x88\x01\x01\x42\x12\n\x10_fan_speed_ratio"\xfb\x01\n\x14\x43oolingConfiguration\x12"\n\x15minimum_required_fans\x18\x01 \x01(\rH\x01\x88\x01\x01\x12/\n\x04\x61uto\x18\x02 \x01(\x0b\x32\x1f.braiins.bos.v1.CoolingAutoModeH\x00\x12\x33\n\x06manual\x18\x03 \x01(\x0b\x32!.braiins.bos.v1.CoolingManualModeH\x00\x12\x37\n\x08\x64isabled\x18\x04 \x01(\x0b\x32#.braiins.bos.v1.CoolingDisabledModeH\x00\x42\x06\n\x04modeB\x18\n\x16_minimum_required_fans"\x99\x03\n\x12\x43oolingConstraints\x12\x39\n\x14\x64\x65\x66\x61ult_cooling_mode\x18\x01 \x01(\x0e\x32\x1b.braiins.bos.v1.CoolingMode\x12\x42\n\x12target_temperature\x18\x02 \x01(\x0b\x32&.braiins.bos.v1.TemperatureConstraints\x12?\n\x0fhot_temperature\x18\x03 \x01(\x0b\x32&.braiins.bos.v1.TemperatureConstraints\x12\x45\n\x15\x64\x61ngerous_temperature\x18\x04 \x01(\x0b\x32&.braiins.bos.v1.TemperatureConstraints\x12:\n\x0f\x66\x61n_speed_ratio\x18\x05 \x01(\x0b\x32!.braiins.bos.v1.DoubleConstraints\x12@\n\x15minimum_required_fans\x18\x06 \x01(\x0b\x32!.braiins.bos.v1.UInt32Constraints"s\n\x08\x46\x61nState\x12\x15\n\x08position\x18\x01 \x01(\rH\x00\x88\x01\x01\x12\x0b\n\x03rpm\x18\x02 \x01(\r\x12\x1f\n\x12target_speed_ratio\x18\x03 \x01(\x01H\x01\x88\x01\x01\x42\x0b\n\t_positionB\x15\n\x13_target_speed_ratio"\x8f\x01\n\x11TemperatureSensor\x12\x0f\n\x02id\x18\x01 \x01(\rH\x00\x88\x01\x01\x12\x30\n\x08location\x18\x02 \x01(\x0e\x32\x1e.braiins.bos.v1.SensorLocation\x12\x30\n\x0btemperature\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.TemperatureB\x05\n\x03_id"\x18\n\x16GetCoolingStateRequest"\x81\x01\n\x17GetCoolingStateResponse\x12&\n\x04\x66\x61ns\x18\x01 \x03(\x0b\x32\x18.braiins.bos.v1.FanState\x12>\n\x13highest_temperature\x18\x02 \x01(\x0b\x32!.braiins.bos.v1.TemperatureSensor"i\n\x17SetImmersionModeRequest\x12/\n\x0bsave_action\x18\x01 \x01(\x0e\x32\x1a.braiins.bos.v1.SaveAction\x12\x1d\n\x15\x65nable_immersion_mode\x18\x02 \x01(\x08"2\n\x18SetImmersionModeResponse\x12\x16\n\x0eimmersion_mode\x18\x01 \x01(\x08*v\n\x0b\x43oolingMode\x12\x1c\n\x18\x43OOLING_MODE_UNSPECIFIED\x10\x00\x12\x15\n\x11\x43OOLING_MODE_AUTO\x10\x01\x12\x17\n\x13\x43OOLING_MODE_MANUAL\x10\x02\x12\x19\n\x15\x43OOLING_MODE_DISABLED\x10\x03*d\n\x0eSensorLocation\x12\x1f\n\x1bSENSOR_LOCATION_UNSPECIFIED\x10\x00\x12\x18\n\x14SENSOR_LOCATION_CHIP\x10\x01\x12\x17\n\x13SENSOR_LOCATION_PCB\x10\x02\x32\xdb\x01\n\x0e\x43oolingService\x12\x62\n\x0fGetCoolingState\x12&.braiins.bos.v1.GetCoolingStateRequest\x1a\'.braiins.bos.v1.GetCoolingStateResponse\x12\x65\n\x10SetImmersionMode\x12\'.braiins.bos.v1.SetImmersionModeRequest\x1a(.braiins.bos.v1.SetImmersionModeResponseb\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.cooling_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
DESCRIPTOR._options = None
_globals["_COOLINGMODE"]._serialized_start = 1803
_globals["_COOLINGMODE"]._serialized_end = 1921
_globals["_SENSORLOCATION"]._serialized_start = 1923
_globals["_SENSORLOCATION"]._serialized_end = 2023
_globals["_COOLINGAUTOMODE"]._serialized_start = 108
_globals["_COOLINGAUTOMODE"]._serialized_end = 296
_globals["_COOLINGMANUALMODE"]._serialized_start = 299
_globals["_COOLINGMANUALMODE"]._serialized_end = 482
_globals["_COOLINGDISABLEDMODE"]._serialized_start = 484
_globals["_COOLINGDISABLEDMODE"]._serialized_end = 555
_globals["_COOLINGCONFIGURATION"]._serialized_start = 558
_globals["_COOLINGCONFIGURATION"]._serialized_end = 809
_globals["_COOLINGCONSTRAINTS"]._serialized_start = 812
_globals["_COOLINGCONSTRAINTS"]._serialized_end = 1221
_globals["_FANSTATE"]._serialized_start = 1223
_globals["_FANSTATE"]._serialized_end = 1338
_globals["_TEMPERATURESENSOR"]._serialized_start = 1341
_globals["_TEMPERATURESENSOR"]._serialized_end = 1484
_globals["_GETCOOLINGSTATEREQUEST"]._serialized_start = 1486
_globals["_GETCOOLINGSTATEREQUEST"]._serialized_end = 1510
_globals["_GETCOOLINGSTATERESPONSE"]._serialized_start = 1513
_globals["_GETCOOLINGSTATERESPONSE"]._serialized_end = 1642
_globals["_SETIMMERSIONMODEREQUEST"]._serialized_start = 1644
_globals["_SETIMMERSIONMODEREQUEST"]._serialized_end = 1749
_globals["_SETIMMERSIONMODERESPONSE"]._serialized_start = 1751
_globals["_SETIMMERSIONMODERESPONSE"]._serialized_end = 1801
_globals["_COOLINGSERVICE"]._serialized_start = 2026
_globals["_COOLINGSERVICE"]._serialized_end = 2245
# @@protoc_insertion_point(module_scope)


@@ -1,42 +0,0 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/license.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from ...bos.v1 import units_pb2 as bos_dot_v1_dot_units__pb2
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x14\x62os/v1/license.proto\x12\x0e\x62raiins.bos.v1\x1a\x12\x62os/v1/units.proto")\n\x0bNoneLicense\x12\x1a\n\x12time_to_restricted\x18\x01 \x01(\r"\x10\n\x0eLimitedLicense"\x9a\x01\n\x0cValidLicense\x12)\n\x04type\x18\x01 \x01(\x0e\x32\x1b.braiins.bos.v1.LicenseType\x12\x15\n\rcontract_name\x18\x02 \x01(\t\x12\x1a\n\x12time_to_restricted\x18\x03 \x01(\r\x12,\n\x07\x64\x65v_fee\x18\x04 \x01(\x0b\x32\x1b.braiins.bos.v1.BasesPoints"\x80\x01\n\x0e\x45xpiredLicense\x12)\n\x04type\x18\x01 \x01(\x0e\x32\x1b.braiins.bos.v1.LicenseType\x12\x15\n\rcontract_name\x18\x02 \x01(\t\x12,\n\x07\x64\x65v_fee\x18\x03 \x01(\x0b\x32\x1b.braiins.bos.v1.BasesPoints"\x18\n\x16GetLicenseStateRequest"\xe4\x01\n\x17GetLicenseStateResponse\x12+\n\x04none\x18\x01 \x01(\x0b\x32\x1b.braiins.bos.v1.NoneLicenseH\x00\x12\x31\n\x07limited\x18\x02 \x01(\x0b\x32\x1e.braiins.bos.v1.LimitedLicenseH\x00\x12-\n\x05valid\x18\x03 \x01(\x0b\x32\x1c.braiins.bos.v1.ValidLicenseH\x00\x12\x31\n\x07\x65xpired\x18\x04 \x01(\x0b\x32\x1e.braiins.bos.v1.ExpiredLicenseH\x00\x42\x07\n\x05state*_\n\x0bLicenseType\x12\x1c\n\x18LICENSE_TYPE_UNSPECIFIED\x10\x00\x12\x19\n\x15LICENSE_TYPE_STANDARD\x10\x01\x12\x17\n\x13LICENSE_TYPE_CUSTOM\x10\x02\x32t\n\x0eLicenseService\x12\x62\n\x0fGetLicenseState\x12&.braiins.bos.v1.GetLicenseStateRequest\x1a\'.braiins.bos.v1.GetLicenseStateResponseb\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.license_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
DESCRIPTOR._options = None
_globals["_LICENSETYPE"]._serialized_start = 666
_globals["_LICENSETYPE"]._serialized_end = 761
_globals["_NONELICENSE"]._serialized_start = 60
_globals["_NONELICENSE"]._serialized_end = 101
_globals["_LIMITEDLICENSE"]._serialized_start = 103
_globals["_LIMITEDLICENSE"]._serialized_end = 119
_globals["_VALIDLICENSE"]._serialized_start = 122
_globals["_VALIDLICENSE"]._serialized_end = 276
_globals["_EXPIREDLICENSE"]._serialized_start = 279
_globals["_EXPIREDLICENSE"]._serialized_end = 407
_globals["_GETLICENSESTATEREQUEST"]._serialized_start = 409
_globals["_GETLICENSESTATEREQUEST"]._serialized_end = 433
_globals["_GETLICENSESTATERESPONSE"]._serialized_start = 436
_globals["_GETLICENSESTATERESPONSE"]._serialized_end = 664
_globals["_LICENSESERVICE"]._serialized_start = 763
_globals["_LICENSESERVICE"]._serialized_end = 879
# @@protoc_insertion_point(module_scope)

File diff suppressed because one or more lines are too long


@@ -1,110 +0,0 @@
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
from dataclasses import asdict, dataclass
from typing import Union


@dataclass
class Frequency:
    hertz: float


@dataclass
class Voltage:
    volt: float


@dataclass
class Power:
    watt: int


@dataclass
class TeraHashrate:
    terahash_per_second: float


@dataclass
class HashboardPerformanceSettings:
    id: str
    frequency: Frequency
    voltage: Voltage


@dataclass
class ManualPerformanceMode:
    global_frequency: Frequency
    global_voltage: Voltage
    hashboards: list[HashboardPerformanceSettings]


@dataclass
class PowerTargetMode:
    power_target: Power


@dataclass
class HashrateTargetMode:
    hashrate_target: TeraHashrate


@dataclass
class TunerPerformanceMode:
    target: Union[PowerTargetMode, HashrateTargetMode]


@dataclass
class PerformanceMode:
    mode: Union[ManualPerformanceMode, TunerPerformanceMode]

    @classmethod
    def create(
        cls,
        power_target: int = None,
        hashrate_target: float = None,
        manual_configuration: ManualPerformanceMode = None,
    ):
        provided_args = [power_target, hashrate_target, manual_configuration]
        if sum(arg is not None for arg in provided_args) > 1:
            raise ValueError(
                "More than one keyword argument provided. Please use only power target, hashrate target, or manual config."
            )
        elif sum(arg is not None for arg in provided_args) < 1:
            raise ValueError(
                "Please pass one of power target, hashrate target, or manual config."
            )
        if power_target is not None:
            return cls(
                mode=TunerPerformanceMode(
                    target=PowerTargetMode(power_target=Power(watt=power_target))
                )
            )
        elif hashrate_target is not None:
            return cls(
                mode=TunerPerformanceMode(
                    target=HashrateTargetMode(
                        hashrate_target=TeraHashrate(
                            terahash_per_second=hashrate_target
                        )
                    )
                )
            )
        elif manual_configuration is not None:
            return cls(mode=manual_configuration)

    def as_dict(self):
        return asdict(self)
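The removed `PerformanceMode.create` classmethod enforces that exactly one of its keyword arguments is supplied. That guard can be factored into a small standalone helper (a sketch for illustration, not part of pyasic; the name `require_exactly_one` is hypothetical):

```python
def require_exactly_one(**kwargs):
    """Return the (name, value) pair of the single non-None keyword argument.

    Raises ValueError when zero or more than one argument is provided,
    mirroring the validation at the top of PerformanceMode.create.
    """
    provided = {k: v for k, v in kwargs.items() if v is not None}
    if len(provided) > 1:
        raise ValueError("More than one keyword argument provided. Please use only one.")
    if not provided:
        raise ValueError("Please pass exactly one keyword argument.")
    return provided.popitem()


# Example: selecting a tuning target the same way create() does.
name, value = require_exactly_one(power_target=3400, hashrate_target=None)
print(name, value)  # power_target 3400
```

Factoring the check out keeps the branch ladder in `create` focused on building the right mode object.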

File diff suppressed because one or more lines are too long


@@ -1,75 +0,0 @@
# -*- coding: utf-8 -*-
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/pool.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from ...bos.v1 import common_pb2 as bos_dot_v1_dot_common__pb2
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x11\x62os/v1/pool.proto\x12\x0e\x62raiins.bos.v1\x1a\x13\x62os/v1/common.proto"\x16\n\x05Quota\x12\r\n\x05value\x18\x01 \x01(\r" \n\x0f\x46ixedShareRatio\x12\r\n\x05value\x18\x01 \x01(\x01"\xe4\x01\n\x16PoolGroupConfiguration\x12\x0b\n\x03uid\x18\x01 \x01(\t\x12\x0c\n\x04name\x18\x02 \x01(\t\x12&\n\x05quota\x18\x03 \x01(\x0b\x32\x15.braiins.bos.v1.QuotaH\x00\x12<\n\x11\x66ixed_share_ratio\x18\x04 \x01(\x0b\x32\x1f.braiins.bos.v1.FixedShareRatioH\x00\x12\x30\n\x05pools\x18\x05 \x03(\x0b\x32!.braiins.bos.v1.PoolConfigurationB\x17\n\x15load_balance_strategy"\x81\x01\n\x11PoolConfiguration\x12\x0b\n\x03uid\x18\x01 \x01(\t\x12\x0b\n\x03url\x18\x02 \x01(\t\x12\x0c\n\x04user\x18\x03 \x01(\t\x12\x15\n\x08password\x18\x04 \x01(\tH\x00\x88\x01\x01\x12\x14\n\x07\x65nabled\x18\x05 \x01(\x08H\x01\x88\x01\x01\x42\x0b\n\t_passwordB\n\n\x08_enabled"\xb0\x01\n\tPoolGroup\x12\x0c\n\x04name\x18\x01 \x01(\t\x12&\n\x05quota\x18\x02 \x01(\x0b\x32\x15.braiins.bos.v1.QuotaH\x00\x12<\n\x11\x66ixed_share_ratio\x18\x03 \x01(\x0b\x32\x1f.braiins.bos.v1.FixedShareRatioH\x00\x12#\n\x05pools\x18\x04 \x03(\x0b\x32\x14.braiins.bos.v1.PoolB\n\n\x08strategy"\x88\x01\n\x04Pool\x12\x0b\n\x03uid\x18\x01 \x01(\t\x12\x0b\n\x03url\x18\x02 \x01(\t\x12\x0c\n\x04user\x18\x03 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x04 \x01(\x08\x12\r\n\x05\x61live\x18\x05 \x01(\x08\x12\x0e\n\x06\x61\x63tive\x18\x06 \x01(\x08\x12(\n\x05stats\x18\x07 \x01(\x0b\x32\x19.braiins.bos.v1.PoolStats"\x98\x01\n\tPoolStats\x12\x17\n\x0f\x61\x63\x63\x65pted_shares\x18\x01 \x01(\x04\x12\x17\n\x0frejected_shares\x18\x02 \x01(\x04\x12\x14\n\x0cstale_shares\x18\x03 \x01(\x04\x12\x17\n\x0flast_difficulty\x18\x04 \x01(\x04\x12\x12\n\nbest_share\x18\x05 \x01(\x04\x12\x16\n\x0egenerated_work\x18\x06 \x01(\x04"\x16\n\x14GetPoolGroupsRequest"G\n\x15GetPoolGroupsResponse\x12.\n\x0bpool_groups\x18\x01 \x03(\x0b\x32\x19.braiins.bos.v1.PoolGroup"\x80\x01\n\x16\x43reatePoolGroupRequest\x12/\n\x0bsave_action\x18\x01 '
b'\x01(\x0e\x32\x1a.braiins.bos.v1.SaveAction\x12\x35\n\x05group\x18\x02 \x01(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration"P\n\x17\x43reatePoolGroupResponse\x12\x35\n\x05group\x18\x01 \x01(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration"\x80\x01\n\x16UpdatePoolGroupRequest\x12/\n\x0bsave_action\x18\x01 \x01(\x0e\x32\x1a.braiins.bos.v1.SaveAction\x12\x35\n\x05group\x18\x02 \x01(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration"P\n\x17UpdatePoolGroupResponse\x12\x35\n\x05group\x18\x01 \x01(\x0b\x32&.braiins.bos.v1.PoolGroupConfiguration"V\n\x16RemovePoolGroupRequest\x12/\n\x0bsave_action\x18\x01 \x01(\x0e\x32\x1a.braiins.bos.v1.SaveAction\x12\x0b\n\x03uid\x18\x02 \x01(\t"\x19\n\x17RemovePoolGroupResponse2\x97\x03\n\x0bPoolService\x12\\\n\rGetPoolGroups\x12$.braiins.bos.v1.GetPoolGroupsRequest\x1a%.braiins.bos.v1.GetPoolGroupsResponse\x12\x62\n\x0f\x43reatePoolGroup\x12&.braiins.bos.v1.CreatePoolGroupRequest\x1a\'.braiins.bos.v1.CreatePoolGroupResponse\x12\x62\n\x0fUpdatePoolGroup\x12&.braiins.bos.v1.UpdatePoolGroupRequest\x1a\'.braiins.bos.v1.UpdatePoolGroupResponse\x12\x62\n\x0fRemovePoolGroup\x12&.braiins.bos.v1.RemovePoolGroupRequest\x1a\'.braiins.bos.v1.RemovePoolGroupResponseb\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.pool_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_QUOTA"]._serialized_start = 58
    _globals["_QUOTA"]._serialized_end = 80
    _globals["_FIXEDSHARERATIO"]._serialized_start = 82
    _globals["_FIXEDSHARERATIO"]._serialized_end = 114
    _globals["_POOLGROUPCONFIGURATION"]._serialized_start = 117
    _globals["_POOLGROUPCONFIGURATION"]._serialized_end = 345
    _globals["_POOLCONFIGURATION"]._serialized_start = 348
    _globals["_POOLCONFIGURATION"]._serialized_end = 477
    _globals["_POOLGROUP"]._serialized_start = 480
    _globals["_POOLGROUP"]._serialized_end = 656
    _globals["_POOL"]._serialized_start = 659
    _globals["_POOL"]._serialized_end = 795
    _globals["_POOLSTATS"]._serialized_start = 798
    _globals["_POOLSTATS"]._serialized_end = 950
    _globals["_GETPOOLGROUPSREQUEST"]._serialized_start = 952
    _globals["_GETPOOLGROUPSREQUEST"]._serialized_end = 974
    _globals["_GETPOOLGROUPSRESPONSE"]._serialized_start = 976
    _globals["_GETPOOLGROUPSRESPONSE"]._serialized_end = 1047
    _globals["_CREATEPOOLGROUPREQUEST"]._serialized_start = 1050
    _globals["_CREATEPOOLGROUPREQUEST"]._serialized_end = 1178
    _globals["_CREATEPOOLGROUPRESPONSE"]._serialized_start = 1180
    _globals["_CREATEPOOLGROUPRESPONSE"]._serialized_end = 1260
    _globals["_UPDATEPOOLGROUPREQUEST"]._serialized_start = 1263
    _globals["_UPDATEPOOLGROUPREQUEST"]._serialized_end = 1391
    _globals["_UPDATEPOOLGROUPRESPONSE"]._serialized_start = 1393
    _globals["_UPDATEPOOLGROUPRESPONSE"]._serialized_end = 1473
    _globals["_REMOVEPOOLGROUPREQUEST"]._serialized_start = 1475
    _globals["_REMOVEPOOLGROUPREQUEST"]._serialized_end = 1561
    _globals["_REMOVEPOOLGROUPRESPONSE"]._serialized_start = 1563
    _globals["_REMOVEPOOLGROUPRESPONSE"]._serialized_end = 1588
    _globals["_POOLSERVICE"]._serialized_start = 1591
    _globals["_POOLSERVICE"]._serialized_end = 1998
# @@protoc_insertion_point(module_scope)


@@ -1,61 +0,0 @@
# -*- coding: utf-8 -*-
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/units.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x12\x62os/v1/units.proto\x12\x0e\x62raiins.bos.v1"+\n\x0cMegaHashrate\x12\x1b\n\x13megahash_per_second\x18\x01 \x01(\x01"+\n\x0cGigaHashrate\x12\x1b\n\x13gigahash_per_second\x18\x01 \x01(\x01"+\n\x0cTeraHashrate\x12\x1b\n\x13terahash_per_second\x18\x01 \x01(\x01"\x1a\n\tFrequency\x12\r\n\x05hertz\x18\x01 \x01(\x01"\x17\n\x07Voltage\x12\x0c\n\x04volt\x18\x01 \x01(\x01"\x15\n\x05Power\x12\x0c\n\x04watt\x18\x01 \x01(\x04"-\n\x0fPowerEfficiency\x12\x1a\n\x12joule_per_terahash\x18\x01 \x01(\x01"\x1f\n\x0bTemperature\x12\x10\n\x08\x64\x65gree_c\x18\x01 \x01(\x01"\x1a\n\x0b\x42\x61sesPoints\x12\x0b\n\x03\x62sp\x18\x01 \x01(\r"\x16\n\x05Hours\x12\r\n\x05hours\x18\x01 \x01(\rb\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.units_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_MEGAHASHRATE"]._serialized_start = 38
    _globals["_MEGAHASHRATE"]._serialized_end = 81
    _globals["_GIGAHASHRATE"]._serialized_start = 83
    _globals["_GIGAHASHRATE"]._serialized_end = 126
    _globals["_TERAHASHRATE"]._serialized_start = 128
    _globals["_TERAHASHRATE"]._serialized_end = 171
    _globals["_FREQUENCY"]._serialized_start = 173
    _globals["_FREQUENCY"]._serialized_end = 199
    _globals["_VOLTAGE"]._serialized_start = 201
    _globals["_VOLTAGE"]._serialized_end = 224
    _globals["_POWER"]._serialized_start = 226
    _globals["_POWER"]._serialized_end = 247
    _globals["_POWEREFFICIENCY"]._serialized_start = 249
    _globals["_POWEREFFICIENCY"]._serialized_end = 294
    _globals["_TEMPERATURE"]._serialized_start = 296
    _globals["_TEMPERATURE"]._serialized_end = 327
    _globals["_BASESPOINTS"]._serialized_start = 329
    _globals["_BASESPOINTS"]._serialized_end = 355
    _globals["_HOURS"]._serialized_start = 357
    _globals["_HOURS"]._serialized_end = 379
# @@protoc_insertion_point(module_scope)
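The bytes passed to `AddSerializedFile` are a serialized `FileDescriptorProto`, and the `_serialized_start`/`_serialized_end` values above are byte offsets into that blob. The framing can be seen by hand-decoding the first two fields of the units descriptor with a minimal varint reader (a sketch for illustration only, not how protobuf should be parsed in practice):

```python
def read_varint(buf, pos):
    """Decode a little-endian base-128 varint starting at pos."""
    result = shift = 0
    while True:
        byte = buf[pos]
        pos += 1
        result |= (byte & 0x7F) << shift
        if not byte & 0x80:
            return result, pos
        shift += 7


def read_len_field(buf, pos):
    """Read one length-delimited protobuf field: (field_no, payload, new_pos)."""
    tag, pos = read_varint(buf, pos)
    field_no, wire_type = tag >> 3, tag & 0x07
    assert wire_type == 2  # length-delimited
    length, pos = read_varint(buf, pos)
    return field_no, buf[pos : pos + length], pos + length


# Prefix of the serialized descriptor above.
blob = b'\n\x12\x62os/v1/units.proto\x12\x0e\x62raiins.bos.v1'
field, name, pos = read_len_field(blob, 0)       # field 1: FileDescriptorProto.name
print(field, name)                               # 1 b'bos/v1/units.proto'
field, package, pos = read_len_field(blob, pos)  # field 2: FileDescriptorProto.package
print(field, package)                            # 2 b'braiins.bos.v1'
```

This is why the offsets differ per module: each message definition occupies a fixed byte range inside its file descriptor.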


@@ -1,47 +0,0 @@
# -*- coding: utf-8 -*-
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/v1/work.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from ...bos.v1 import units_pb2 as bos_dot_v1_dot_units__pb2
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x11\x62os/v1/work.proto\x12\x0e\x62raiins.bos.v1\x1a\x12\x62os/v1/units.proto"\xef\x03\n\x0cRealHashrate\x12-\n\x07last_5s\x18\x01 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_15s\x18\x02 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_30s\x18\x03 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12-\n\x07last_1m\x18\x04 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12-\n\x07last_5m\x18\x05 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_15m\x18\x06 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_30m\x18\x07 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12-\n\x07last_1h\x18\x08 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12.\n\x08last_24h\x18\t \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12\x33\n\rsince_restart\x18\n \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate"\xde\x01\n\x0fWorkSolverStats\x12\x33\n\rreal_hashrate\x18\x01 \x01(\x0b\x32\x1c.braiins.bos.v1.RealHashrate\x12\x36\n\x10nominal_hashrate\x18\x02 \x01(\x0b\x32\x1c.braiins.bos.v1.GigaHashrate\x12\x34\n\x0e\x65rror_hashrate\x18\x03 \x01(\x0b\x32\x1c.braiins.bos.v1.MegaHashrate\x12\x14\n\x0c\x66ound_blocks\x18\x04 \x01(\r\x12\x12\n\nbest_share\x18\x05 \x01(\x04\x62\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.v1.work_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_REALHASHRATE"]._serialized_start = 58
    _globals["_REALHASHRATE"]._serialized_end = 553
    _globals["_WORKSOLVERSTATS"]._serialized_start = 556
    _globals["_WORKSOLVERSTATS"]._serialized_end = 778
# @@protoc_insertion_point(module_scope)


@@ -1,47 +0,0 @@
# -*- coding: utf-8 -*-
# ------------------------------------------------------------------------------
# Copyright 2022 Upstream Data Inc -
# -
# Licensed under the Apache License, Version 2.0 (the "License"); -
# you may not use this file except in compliance with the License. -
# You may obtain a copy of the License at -
# -
# http://www.apache.org/licenses/LICENSE-2.0 -
# -
# Unless required by applicable law or agreed to in writing, software -
# distributed under the License is distributed on an "AS IS" BASIS, -
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -
# See the License for the specific language governing permissions and -
# limitations under the License. -
# ------------------------------------------------------------------------------
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: bos/version.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
b'\n\x11\x62os/version.proto\x12\x0b\x62raiins.bos"U\n\nApiVersion\x12\r\n\x05major\x18\x01 \x01(\x04\x12\r\n\x05minor\x18\x02 \x01(\x04\x12\r\n\x05patch\x18\x03 \x01(\x04\x12\x0b\n\x03pre\x18\x04 \x01(\t\x12\r\n\x05\x62uild\x18\x05 \x01(\t"\x13\n\x11\x41piVersionRequest2]\n\x11\x41piVersionService\x12H\n\rGetApiVersion\x12\x1e.braiins.bos.ApiVersionRequest\x1a\x17.braiins.bos.ApiVersionb\x06proto3'
)
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, "bos.version_pb2", _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
    DESCRIPTOR._options = None
    _globals["_APIVERSION"]._serialized_start = 34
    _globals["_APIVERSION"]._serialized_end = 119
    _globals["_APIVERSIONREQUEST"]._serialized_start = 121
    _globals["_APIVERSIONREQUEST"]._serialized_end = 140
    _globals["_APIVERSIONSERVICE"]._serialized_start = 142
    _globals["_APIVERSIONSERVICE"]._serialized_end = 235
# @@protoc_insertion_point(module_scope)


@@ -0,0 +1,80 @@
# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: bos/version.proto
# plugin: python-betterproto
# This file has been @generated
from dataclasses import dataclass
from typing import (
    TYPE_CHECKING,
    Dict,
    Optional,
)

import betterproto
import grpclib
from betterproto.grpc.grpclib_server import ServiceBase

if TYPE_CHECKING:
    import grpclib.server
    from betterproto.grpc.grpclib_client import MetadataLike
    from grpclib.metadata import Deadline


@dataclass(eq=False, repr=False)
class ApiVersion(betterproto.Message):
    """LATEST_API_VERSION=1.0.0-beta.4"""

    major: int = betterproto.uint64_field(1)
    minor: int = betterproto.uint64_field(2)
    patch: int = betterproto.uint64_field(3)
    pre: str = betterproto.string_field(4)
    build: str = betterproto.string_field(5)


@dataclass(eq=False, repr=False)
class ApiVersionRequest(betterproto.Message):
    pass


class ApiVersionServiceStub(betterproto.ServiceStub):
    async def get_api_version(
        self,
        api_version_request: "ApiVersionRequest",
        *,
        timeout: Optional[float] = None,
        deadline: Optional["Deadline"] = None,
        metadata: Optional["MetadataLike"] = None
    ) -> "ApiVersion":
        return await self._unary_unary(
            "/braiins.bos.ApiVersionService/GetApiVersion",
            api_version_request,
            ApiVersion,
            timeout=timeout,
            deadline=deadline,
            metadata=metadata,
        )


class ApiVersionServiceBase(ServiceBase):
    async def get_api_version(
        self, api_version_request: "ApiVersionRequest"
    ) -> "ApiVersion":
        raise grpclib.GRPCError(grpclib.const.Status.UNIMPLEMENTED)

    async def __rpc_get_api_version(
        self, stream: "grpclib.server.Stream[ApiVersionRequest, ApiVersion]"
    ) -> None:
        request = await stream.recv_message()
        response = await self.get_api_version(request)
        await stream.send_message(response)

    def __mapping__(self) -> Dict[str, grpclib.const.Handler]:
        return {
            "/braiins.bos.ApiVersionService/GetApiVersion": grpclib.const.Handler(
                self.__rpc_get_api_version,
                grpclib.const.Cardinality.UNARY_UNARY,
                ApiVersionRequest,
                ApiVersion,
            ),
        }
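The `ApiVersion` message mirrors a semantic-version string; its docstring notes `LATEST_API_VERSION=1.0.0-beta.4`. Rendering those fields back to string form could look like this plain-Python sketch (the `format_api_version` helper is hypothetical, not part of pyasic or betterproto):

```python
def format_api_version(major, minor, patch, pre="", build=""):
    """Join ApiVersion-style fields into a semver string, e.g. 1.0.0-beta.4."""
    version = f"{major}.{minor}.{patch}"
    if pre:
        version += f"-{pre}"    # pre-release label, e.g. "beta.4"
    if build:
        version += f"+{build}"  # build metadata, per semver convention
    return version


print(format_api_version(1, 0, 0, pre="beta.4"))  # 1.0.0-beta.4
```

Splitting the version into numeric fields, as the message does, lets clients compare versions field by field instead of parsing strings.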

File diff suppressed because it is too large


@@ -20,8 +20,9 @@ from typing import Union
import httpx
from pyasic import settings
from pyasic.web import BaseWebAPI
from pyasic.errors import APIError, APIWarning
from pyasic.web import BaseWebAPI
class ePICWebAPI(BaseWebAPI):
def __init__(self, ip: str) -> None:
@@ -31,22 +32,24 @@ class ePICWebAPI(BaseWebAPI):
self.token = None
async def send_command(
self,
command: Union[str, bytes],
ignore_errors: bool = False,
allow_warning: bool = True,
post: bool = False,
**parameters: Union[str, int, bool],
self,
command: Union[str, bytes],
ignore_errors: bool = False,
allow_warning: bool = True,
post: bool = False,
**parameters: Union[str, int, bool],
) -> dict:
if post or parameters != {}:
post = True
async with httpx.AsyncClient(transport=settings.transport()) as client:
for i in range(settings.get("get_data_retries", 1) + 1):
try:
if post:
epic_param = {"param": parameters.get("parameters"),
"password": self.pwd}
epic_param = {
"param": parameters.get("parameters"),
"password": self.pwd,
}
response = await client.post(
f"http://{self.ip}:4028/{command}",
timeout=5,
@@ -56,14 +59,17 @@ class ePICWebAPI(BaseWebAPI):
response = await client.get(
f"http://{self.ip}:4028/{command}",
timeout=5,
)
if not response.status_code == 200:
continue
json_data = response.json()
if json_data:
# The API can return a fail status if the miner cannot return the requested data. Catch this and pass
if "result" in json_data and json_data["result"] is False and not post:
if (
"result" in json_data
and json_data["result"] is False
and not post
):
if not i > settings.get("get_data_retries", 1):
continue
if not ignore_errors:
@@ -102,13 +108,12 @@ class ePICWebAPI(BaseWebAPI):
async def summary(self):
return await self.send_command("summary")
async def hashrate(self):
return await self.send_command("hashrate")
async def network(self):
return await self.send_command("network")
async def capabilities(self):
return await self.send_command("capabilities")
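`send_command` above retries a request up to `get_data_retries` + 1 times before giving up. That control flow can be isolated into a transport-agnostic coroutine (a sketch assuming the fetch callable raises on failure; `with_retries` is not a pyasic API):

```python
import asyncio


async def with_retries(fetch, retries=1):
    """Await fetch() up to retries + 1 times, returning the first success.

    Re-raises the last error if every attempt fails, mirroring the retry
    loop in ePICWebAPI.send_command.
    """
    last_exc = None
    for _ in range(retries + 1):
        try:
            return await fetch()
        except Exception as exc:  # broad on purpose: any failure triggers a retry
            last_exc = exc
    raise last_exc


# Usage with a stub that fails once and then succeeds.
attempts = {"n": 0}


async def flaky():
    attempts["n"] += 1
    if attempts["n"] < 2:
        raise ConnectionError("transient failure")
    return {"result": True}


print(asyncio.run(with_retries(flaky, retries=1)))  # {'result': True}
```

Keeping the retry loop separate from the HTTP details makes the "continue on non-200 / continue on fail status" branches in `send_command` easier to test in isolation.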


@@ -1,6 +1,6 @@
[tool.poetry]
name = "pyasic"
version = "0.42.2"
version = "0.43.1"
description = "A simplified and standardized interface for Bitcoin ASICs."
authors = ["UpstreamData <brett@upstreamdata.ca>"]
repository = "https://github.com/UpstreamData/pyasic"
@@ -15,6 +15,7 @@ grpc-requests = "^0.1.12"
passlib = "^1.7.4"
pyaml = "^23.9.7"
toml = "^0.10.2"
betterproto = "2.0.0b6"
[tool.poetry.group.dev]
optional = true