
UI permanently unreachable and charge control stuck after a brief Tibber outage #17925

Closed
1 task done
foto-andreas opened this issue Dec 28, 2024 · 40 comments
Labels
question Rather clarification than issue

Comments

@foto-andreas
Contributor

Describe the bug

Last night there was another, presumably brief, outage of the Tibber API, so access to the Pulse data was temporarily unavailable to EVCC. I had the same situation once before recently; this time I was at least able to collect some data from the log.

Both EVCC instances were affected at the same time. The logs are not quite identical; there is one difference at the moment of the outage. See below.

On the EVCC side it then looks like this: the UI is no longer reachable and shows "404 page not found". Both instances behave the same. The log keeps reporting that the grid data is outdated, even though the data is visible and live in the Tibber app. Moreover, in 4 out of 4 cases a restart immediately brought EVCC back to a working state.

Even worse, EVCC then does not release the charge. A charge was planned from 4:00 to 6:00; nothing was charged. EVCC's entire charge control is apparently frozen until the restart, because EVCC does not notice that the Tibber Pulse data is available again.

Please add a safeguard so that the control recovers on its own, charging proceeds as planned in any case, and the UI stays reachable.

Steps to reproduce

  1. Use Tibber Pulse
  2. Knock out the Tibber API :)
  3. Bring the Tibber API back up
  4. Look at the EVCC UI

I have no idea whether the exact situation described can really be reproduced this way.

I did have to tick below that I can reproduce it with the nightly build, since it is a required field, but of course I actually cannot.

Configuration details

log: debug

interval: 15s

sponsortoken: "{{ evcc.sponsor_token }}"

tariffs:
  currency: EUR
  grid:
    type: tibber
    token: "{{ tibber.api_token }}"
    homeid: "{{ tibber.home_id_wk }}"
  feedin:
    type: fixed
    price: 0.1231

meters:
  - name: tibber
    type: template
    template: tibber-pulse
    usage: grid
    token: "{{ tibber.api_token }}"
    homeid: "{{ tibber.home_id_wk }}"

  - name: pv1
    type: template
    template: solaredge-hybrid
    usage: pv
    modbus: tcpip
    host: "{{ pv.host }}"
    port: 502
    timeout: 5s

vehicles:
  - name: zoe
    type: template
    template: renault
    title: Zoe
    user: "{{ zoe.user }}"
    password: "{{ zoe.password }}"
    vin: "{{zoe.vin}}"
    capacity: 52
    identifiers: {{ zoe.identifiers }}
    mode: pv
    minCurrent: 8

  - name: silvie
    type: template
    template: hyundai
    title: Kona Silvie
    user: "{{ silvie.user }}"
    password: "{{ silvie.password }}"
    vin: "{{ silvie.vin }}"
    capacity: 64
    language: de
    identifiers: {{ silvie.identifiers }}
    mode: pv
    minCurrent: 6

chargers:
  - type: template
    template: easee
    user: "{{ easee.user }}"
    password: "{{ easee.password }}"
    charger: "{{ easee.chargers.wk }}"
    name: easee_zuhause

loadpoints:
  - title: Zuhause Wallbox
    charger: easee_zuhause
    mode: pv
    soc:
      poll:
        mode: connected
        interval: 30m
      estimate: true
    enable:
      delay: 60s
    disable:
      delay: 60s

mqtt:
  broker: tls://mqtt.....
  topic: evcc
  user: mqtt
  password: "{{ mqtt.password }}"

site:
  title: Zuhause
  meters:
    grid: tibber
    pv: pv1

plant: XXXXXX

#messaging:
#  services:
#    - type: telegram
#      token: "{{ telegram.token }}"
#      chats: {{ telegram.chats }}

Log details

Everything fine up to here:

evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 charge currents: [0.009 0.006 0.007]A
evcc-wk  | [site  ] DEBUG 2024/12/28 00:30:27 grid power: 200W
evcc-wk  | [site  ] DEBUG 2024/12/28 00:30:27 grid currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2024/12/28 00:30:27 pv power: 0W
evcc-wk  | [site  ] DEBUG 2024/12/28 00:30:27 site power: 200W
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 charge total import: 11197.181kWh
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 charger status: B
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 next soc poll remaining time: 5m29s
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 plan: charge 1h54m45s between 2024-12-28 04:00:00 +0100 CET until 2024-12-28 06:00:00 +0100 CET (power: 11040W, avg c>
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 !! pvScalePhases DOWN activePhases: 1, available power: -200W, scalable: false
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 !! pvScalePhases UP maxPhases: 3, available power: -200W, scalable: false
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 !! pvScalePhases EXIT
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:27 pv charge current: 0A = 0A + -0.87A (200W @ 1p)
evcc-wk  | [site  ] DEBUG 2024/12/28 00:30:42 ----
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:42 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 00:30:42 charge currents: [0.009 0.006 0.007]A
evcc-wk  | [site  ] DEBUG 2024/12/28 00:30:42 pv power: 0W
evcc-wk  | [site  ] ERROR 2024/12/28 00:30:43 grid power: outdated

Once in between:

evcc-wk  | [lp-1  ] DEBUG 2024/12/28 04:29:12 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 04:29:12 charge currents: [0.008 0.007 0.007]A
evcc-wk  | [site  ] ERROR 2024/12/28 04:29:18 pv 1 power: add[0]: model 103 block 0 point DCW: dial tcp 87.169.86.137:502: connect: connection refused
evcc-wk  | [site  ] DEBUG 2024/12/28 04:29:18 pv power: 0W
evcc-wk  | [site  ] ERROR 2024/12/28 04:29:18 grid power: outdated

Continuously until the restart:

evcc-wk  | [lp-1  ] DEBUG 2024/12/28 06:36:42 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 06:36:42 charge currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2024/12/28 06:36:42 pv power: 0W
evcc-wk  | [site  ] ERROR 2024/12/28 06:36:42 grid power: outdated


After the restart:

evcc-wk  | [main  ] INFO 2024/12/28 06:41:38 evcc 0.131.12
evcc-wk  | [main  ] INFO 2024/12/28 06:41:38 using config file: /etc/evcc.yaml
evcc-wk  | [db    ] INFO 2024/12/28 06:41:38 using sqlite database: /root/.evcc/evcc.db
evcc-wk  | [mqtt  ] INFO 2024/12/28 06:41:38 connecting evcc-2003790081 at tls://mqtt.schrell.de:8883
evcc-wk  | [mqtt  ] DEBUG 2024/12/28 06:41:38 tls://mqtt.schrell.de:8883 connected
evcc-wk  | [main  ] INFO 2024/12/28 06:41:38 listening at :7070
evcc-wk  | [lp-1  ] WARN 2024/12/28 06:41:42 poll mode '{connected 30m0s}' may deplete your battery or lead to API misuse. USE AT YOUR OWN RISK.
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 06:41:42 charge total import: 11197.776kWh
evcc-wk  | [site  ] INFO 2024/12/28 06:41:43 site config:
evcc-wk  | [site  ] INFO 2024/12/28 06:41:43   meters:      grid ✓ pv ✓ battery ✗
evcc-wk  | [site  ] INFO 2024/12/28 06:41:43     grid:      power ✓ energy ✗ currents ✓
evcc-wk  | [site  ] INFO 2024/12/28 06:41:43     pv 1:      power ✓ energy ✗ currents ✗
evcc-wk  | [site  ] INFO 2024/12/28 06:41:43   vehicles:
evcc-wk  | [site  ] INFO 2024/12/28 06:41:43     vehicle 1: range ✓ finish ✓ status ✓ climate ✗ wakeup ✓
evcc-wk  | [site  ] INFO 2024/12/28 06:41:43     vehicle 2: range ✓ finish ✓ status ✓ climate ✓ wakeup ✓
evcc-wk  | [lp-1  ] INFO 2024/12/28 06:41:43 loadpoint 1:
evcc-wk  | [lp-1  ] INFO 2024/12/28 06:41:43   mode:        pv
evcc-wk  | [lp-1  ] INFO 2024/12/28 06:41:43   charger:     power ✓ energy ✓ currents ✓ phases ✓ wakeup ✗
evcc-wk  | [lp-1  ] INFO 2024/12/28 06:41:43   meters:      charge ✓
evcc-wk  | [lp-1  ] INFO 2024/12/28 06:41:43     charge:    power ✓ energy ✓ currents ✓
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 06:41:43 set smart cost limit: 0.1
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 06:41:43 phase timer inactive
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 06:41:43 pv timer inactive
evcc-wk  | [site  ] WARN 2024/12/28 06:41:43 interval <30s can lead to unexpected behavior, see https://docs.evcc.io/docs/reference/configuration/interval
evcc-wk  | [site  ] DEBUG 2024/12/28 06:41:43 ----
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 06:41:43 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2024/12/28 06:41:43 charge currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2024/12/28 06:41:44 pv power: 0W



Second EVCC instance:

evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:29:32 charge power: 0W
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:29:32 charge currents: [0 0 0]A
evcc-py  | [site  ] DEBUG 2024/12/28 00:29:32 grid power: 303W
evcc-py  | [site  ] DEBUG 2024/12/28 00:29:32 grid currents: [0 0 0]A
evcc-py  | [site  ] DEBUG 2024/12/28 00:29:32 pv power: 0W
evcc-py  | [site  ] DEBUG 2024/12/28 00:29:32 site power: 303W
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:29:32 charge total import: 5104.054kWh
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:29:32 charger status: A
evcc-py  | [pulse ] ERROR 2024/12/28 00:29:41 Message: unable to start stream api:e90e4c627d25417383ba9b4cb7821497 for device e90e4c62-7d25-4173-83ba-9b4cb7821497;>
evcc-py  | [pulse ] ERROR 2024/12/28 00:29:42 Message: unable to start stream api:e90e4c627d25417383ba9b4cb7821497 for device e90e4c62-7d25-4173-83ba-9b4cb7821497;>
evcc-py  | [pulse ] ERROR 2024/12/28 00:29:42 Message: unable to start stream api:e90e4c627d25417383ba9b4cb7821497 for device e90e4c62-7d25-4173-83ba-9b4cb7821497;>
evcc-py  | [pulse ] ERROR 2024/12/28 00:29:43 Message: unable to start stream api:e90e4c627d25417383ba9b4cb7821497 for device e90e4c62-7d25-4173-83ba-9b4cb7821497;>
evcc-py  | [site  ] DEBUG 2024/12/28 00:29:47 ----
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:29:47 charge power: 0W
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:29:47 charge currents: [0 0 0]A
evcc-py  | [site  ] DEBUG 2024/12/28 00:29:47 grid power: 303W
evcc-py  | [site  ] DEBUG 2024/12/28 00:29:47 grid currents: [0 0 0]A
evcc-py  | [site  ] DEBUG 2024/12/28 00:29:47 pv power: 0W
evcc-py  | [site  ] DEBUG 2024/12/28 00:29:47 site power: 303W
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:29:47 charge total import: 5104.054kWh
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:29:47 charger status: A
evcc-py  | [pulse ] ERROR 2024/12/28 00:29:48 Message: unable to start stream api:e90e4c627d25417383ba9b4cb7821497 for device e90e4c62-7d25-4173-83ba-9b4cb7821497;>
evcc-py  | [pulse ] ERROR 2024/12/28 00:29:49 Message: unable to start stream api:e90e4c627d25417383ba9b4cb7821497 for device e90e4c62-7d25-4173-83ba-9b4cb7821497;>
evcc-py  | [site  ] DEBUG 2024/12/28 00:30:02 ----
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:30:02 charge power: 0W
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:30:02 charge currents: [0 0 0]A
evcc-py  | [site  ] DEBUG 2024/12/28 00:30:02 grid power: 303W
evcc-py  | [site  ] DEBUG 2024/12/28 00:30:02 grid currents: [0 0 0]A
evcc-py  | [site  ] DEBUG 2024/12/28 00:30:02 pv power: 0W
evcc-py  | [site  ] DEBUG 2024/12/28 00:30:02 site power: 303W
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:30:02 charge total import: 5104.054kWh
evcc-py  | [lp-1  ] DEBUG 2024/12/28 00:30:02 charger status: A

What type of operating system are you running?

Docker container

Nightly build

  • I have verified that the issue is reproducible with the latest nightly build

Version

0.131.12

@andig
Member

andig commented Dec 28, 2024

evcc does not run without a working grid meter. If Tibber is too unreliable, it has to be removed from that role :(

@andig andig closed this as completed Dec 28, 2024
@andig andig added the question Rather clarification than issue label Dec 28, 2024
@andig
Member

andig commented Dec 28, 2024

On the EVCC side it then looks like this: the UI is no longer reachable and shows "404 page not found".

Please start evcc with --profile and, when this happens, attach the goroutine profile from /debug/pprof together with the log file from that moment. A 404 really should not occur.

@foto-andreas
Contributor Author

"--profile" is now in place.

@foto-andreas
Contributor Author

Hm, evcc is running with --profile. The UI is unreachable as described above, but there is no profile. I can simply leave the situation standing for now until tonight, and one of the two containers even longer. If you can give me instructions on what to try, let me know. I'm in customer meetings today; from about 18:00 I can experiment with it.

@andig
Member

andig commented Jan 8, 2025

but there is no profile

Really? That would be a first.

@foto-andreas
Contributor Author

Is that only written on a crash, or also while evcc keeps running? I searched the whole container for profile and debug but found nothing matching. --profile doesn't take an additional parameter, does it?

@foto-andreas
Contributor Author

foto-andreas commented Jan 8, 2025

evcc-wk  | [site  ] DEBUG 2025/01/07 10:25:50 ----
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:25:50 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:25:50 charge currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:25:50 grid power: 347W
evcc-wk  | [site  ] DEBUG 2025/01/07 10:25:50 grid currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:25:51 pv power: 13W
evcc-wk  | [site  ] DEBUG 2025/01/07 10:25:51 site power: 347W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:25:51 charge total import: 11436.146kWh
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:25:51 charger status: A
evcc-wk  | [pulse ] ERROR 2025/01/07 10:26:05 Message: unable to start stream api:a7a10770336a4d0198ecea09c28da682 for device a7a10770-336a-4d01-98ec-x; userId 29fbf633-dee7-4959-9d19-x
d5c97b, Locations: [{Line:1 Column:28}], Extensions: map[], Path: [liveMeasurement]
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:05 ----
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:05 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:05 charge currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:05 grid power: 347W
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:05 grid currents: [0 0 0]A
evcc-wk  | [pulse ] ERROR 2025/01/07 10:26:06 Message: unable to start stream api:a7a10770336a4d0198ecea09c28da682 for device a7a10770-336a-4d01-98ec-x; userId 29fbf633-dee7-4959-9d19-x
d5c97b, Locations: [{Line:1 Column:28}], Extensions: map[], Path: [liveMeasurement]
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:06 pv power: 12W
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:06 site power: 347W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:06 charge total import: 11436.146kWh
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:06 charger status: A
evcc-wk  | [pulse ] ERROR 2025/01/07 10:26:06 Message: unable to start stream api:a7a10770336a4d0198ecea09c28da682 for device a7a10770-336a-4d01-98ec-x; userId 29fbf633-dee7-4959-9d19-x
d5c97b, Locations: [{Line:1 Column:28}], Extensions: map[], Path: [liveMeasurement]
evcc-wk  | [pulse ] ERROR 2025/01/07 10:26:07 Message: unable to start stream api:a7a10770336a4d0198ecea09c28da682 for device a7a10770-336a-4d01-98ec-x; userId x
d5c97b, Locations: [{Line:1 Column:28}], Extensions: map[], Path: [liveMeasurement]
evcc-wk  | [pulse ] ERROR 2025/01/07 10:26:07 Message: unable to start stream api:a7a10770336a4d0198ecea09c28da682 for device a7a10770-336a-4d01-98ec-x; userId 29fbf633-dee7-4959-9d19-x
d5c97b, Locations: [{Line:1 Column:28}], Extensions: map[], Path: [liveMeasurement]
evcc-wk  | [pulse ] ERROR 2025/01/07 10:26:07 Message: unable to start stream api:x for device a7a10770-336a-4d01-98ec-ea09c28da682; userId 29fbf633-dee7-4959-9d19-x
d5c97b, Locations: [{Line:1 Column:28}], Extensions: map[], Path: [liveMeasurement]
evcc-wk  | [pulse ] ERROR 2025/01/07 10:26:08 Message: unable to start stream api:a7a10770336a4d0198ecea09c28da682 for device a7a10770-336a-4d01-98ec-x; userId 29fbf633-dee7-4959-9d19-x
d5c97b, Locations: [{Line:1 Column:28}], Extensions: map[], Path: [liveMeasurement]
evcc-wk  | [pulse ] ERROR 2025/01/07 10:26:08 Message: unable to start stream api:a7a10770336a4d0198ecea09c28da682 for device a7a10770-336a-4d01-98ec-x; userId 29fbf633-dee7-4959-9d19-x
d5c97b, Locations: [{Line:1 Column:28}], Extensions: map[], Path: [liveMeasurement]
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:20 ----
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:20 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:20 charge currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:20 grid power: 347W
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:20 grid currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:21 pv power: 11W
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:21 site power: 347W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:21 charge total import: 11436.146kWh
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:21 charger status: A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:35 ----
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:35 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:35 charge currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:35 grid power: 347W
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:35 grid currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:36 pv power: 10W
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:36 site power: 347W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:36 charge total import: 11436.146kWh
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:36 charger status: A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:50 ----
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:50 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:26:50 charge currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:26:51 pv power: 18W
evcc-wk  | [site  ] ERROR 2025/01/07 10:26:51 grid power: outdated
evcc-wk  | [site  ] DEBUG 2025/01/07 10:27:05 ----
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:27:05 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2025/01/07 10:27:05 charge currents: [0 0 0]A
evcc-wk  | [site  ] DEBUG 2025/01/07 10:27:06 pv power: 13W
evcc-wk  | [site  ] ERROR 2025/01/07 10:27:06 grid power: outdated

@foto-andreas
Contributor Author

... or does the profile directory have to be created first?

@foto-andreas
Contributor Author

Unfortunately I have to leave for now. Does this happen to none of the other Tibber/evcc users? I would actually have expected a more global problem, since both of my sites are always affected.

@foto-andreas
Contributor Author

I tried debugging with delve. That doesn't work inside the running container, though, since it isn't privileged: /proc/sys is mounted read-only and cannot be remounted even as root. With no Go experience at all, I can't get any further on my own for now.

@andig
Member

andig commented Jan 8, 2025

Just open /debug/pprof in the browser.

@foto-andreas
Contributor Author

Ah, I was looking in the file system... The result is unfortunately not as expected:

/app # curl -vvvv http://localhost:7070/debug/pprof
17:56:55.080976 [0-x] == Info: [READ] client_reset, clear readers
17:56:55.081158 [0-0] == Info: Host localhost:7070 was resolved.
17:56:55.081207 [0-0] == Info: IPv6: ::1
17:56:55.081235 [0-0] == Info: IPv4: 127.0.0.1
17:56:55.081268 [0-0] == Info: [SETUP] added
17:56:55.081303 [0-0] == Info: [HAPPY-EYEBALLS] created ipv6 (timeout 300000ms)
17:56:55.081355 [0-0] == Info: [HAPPY-EYEBALLS] created ipv4 (timeout 300000ms)
17:56:55.081417 [0-0] == Info: [HAPPY-EYEBALLS] ipv6 starting (timeout=300000ms)
17:56:55.081495 [0-0] == Info:   Trying [::1]:7070...
17:56:55.081540 [0-0] == Info: [TCP] cf_socket_open() -> 0, fd=4
17:56:55.082343 [0-0] == Info: [TCP] local address ::1 port 48274...
17:56:55.082760 [0-0] == Info: [HAPPY-EYEBALLS] ipv6 connect -> 0, connected=0
17:56:55.082851 [0-0] == Info: [TCP] adjust_pollset, !connected, POLLOUT fd=4
17:56:55.082927 [0-0] == Info: [HAPPY-EYEBALLS] adjust_pollset -> 1 socks
17:56:55.083082 [0-0] == Info: [TCP] connected
17:56:55.083130 [0-0] == Info: [HAPPY-EYEBALLS] ipv6 connect -> 0, connected=1
17:56:55.083210 [0-0] == Info: [HAPPY-EYEBALLS] Connected to localhost (::1) port 7070
17:56:55.083296 [0-0] == Info: Connected to localhost (::1) port 7070
17:56:55.083365 [0-0] == Info: using HTTP/1.x
17:56:55.083497 [0-0] == Info: [TCP] send(len=89) -> 89, err=0
17:56:55.083560 [0-0] => Send header, 89 bytes (0x59)
0000: GET /debug/pprof HTTP/1.1
001b: Host: localhost:7070
0031: User-Agent: curl/8.11.1
004a: Accept: */*
0057: 
17:56:55.083943 [0-0] == Info: [TCP] recv(len=102400) -> 204, err=0
17:56:55.084079 [0-0] <= Recv header, 32 bytes (0x20)
0000: HTTP/1.1 301 Moved Permanently
17:56:55.084178 [0-0] == Info: [WRITE] cw_out, wrote 32 header bytes -> 32
17:56:55.084252 [0-0] == Info: [WRITE] download_write header(type=c, blen=32) -> 0
17:56:55.084331 [0-0] == Info: [WRITE] client_write(type=c, len=32) -> 0
17:56:55.084415 [0-0] <= Recv header, 40 bytes (0x28)
0000: Content-Type: text/html; charset=utf-8
17:56:55.084526 [0-0] == Info: [WRITE] header_collect pushed(type=1, len=40) -> 0
17:56:55.084614 [0-0] == Info: [WRITE] cw_out, wrote 40 header bytes -> 40
17:56:55.084691 [0-0] == Info: [WRITE] download_write header(type=4, blen=40) -> 0
17:56:55.084771 [0-0] == Info: [WRITE] client_write(type=4, len=40) -> 0
17:56:55.084851 [0-0] <= Recv header, 25 bytes (0x19)
0000: Location: /debug/pprof/
17:56:55.084936 [0-0] == Info: [WRITE] header_collect pushed(type=1, len=25) -> 0
17:56:55.085098 [0-0] == Info: [WRITE] cw_out, wrote 25 header bytes -> 25
17:56:55.085169 [0-0] == Info: [WRITE] download_write header(type=4, blen=25) -> 0
17:56:55.085249 [0-0] == Info: [WRITE] client_write(type=4, len=25) -> 0
17:56:55.085318 [0-0] <= Recv header, 37 bytes (0x25)
0000: Date: Wed, 08 Jan 2025 16:56:55 GMT
17:56:55.085415 [0-0] == Info: [WRITE] header_collect pushed(type=1, len=37) -> 0
17:56:55.085496 [0-0] == Info: [WRITE] cw_out, wrote 37 header bytes -> 37
17:56:55.085569 [0-0] == Info: [WRITE] download_write header(type=4, blen=37) -> 0
17:56:55.085647 [0-0] == Info: [WRITE] client_write(type=4, len=37) -> 0
17:56:55.085719 [0-0] <= Recv header, 20 bytes (0x14)
0000: Content-Length: 48
17:56:55.085799 [0-0] == Info: [WRITE] header_collect pushed(type=1, len=20) -> 0
17:56:55.085878 [0-0] == Info: [WRITE] cw_out, wrote 20 header bytes -> 20
17:56:55.085952 [0-0] == Info: [WRITE] download_write header(type=4, blen=20) -> 0
17:56:55.086050 [0-0] == Info: [WRITE] client_write(type=4, len=20) -> 0
17:56:55.086128 [0-0] <= Recv header, 2 bytes (0x2)
0000: 
17:56:55.086186 [0-0] == Info: [WRITE] header_collect pushed(type=1, len=2) -> 0
17:56:55.086264 [0-0] == Info: [WRITE] cw_out, wrote 2 header bytes -> 2
17:56:55.086335 [0-0] == Info: [WRITE] download_write header(type=4, blen=2) -> 0
17:56:55.086412 [0-0] == Info: [WRITE] client_write(type=4, len=2) -> 0
17:56:55.086481 [0-0] <= Recv data, 48 bytes (0x30)
0000: <a href="/debug/pprof/">Moved Permanently</a>...
<a href="/debug/pprof/">Moved Permanently</a>.

17:56:55.086591 [0-0] == Info: [WRITE] cw_out, wrote 48 body bytes -> 48
17:56:55.086661 [0-0] == Info: [WRITE] download_write body(type=1, blen=48) -> 0
17:56:55.086737 [0-0] == Info: [WRITE] client_write(type=1, len=48) -> 0
17:56:55.086806 [0-0] == Info: [WRITE] xfer_write_resp(len=204, eos=0) -> 0
17:56:55.086886 [0-0] == Info: [WRITE] cw-out is notpaused
17:56:55.086964 [0-0] == Info: [WRITE] cw-out done
17:56:55.087041 [0-0] == Info: [READ] client_reset, clear readers
17:56:55.087119 [0-0] == Info: Connection #0 to host localhost left intact

@foto-andreas
Contributor Author

/debug/pprof/

Set debug=1 as a query parameter to export in legacy text format

Types of profiles available:

Count  Profile
1217   allocs
0      block
0      cmdline
49     goroutine
1217   heap
0      mutex
0      profile
13     threadcreate
0      trace

full goroutine stack dump

Profile Descriptions:

  • allocs:
    A sampling of all past memory allocations
  • block:
    Stack traces that led to blocking on synchronization primitives
  • cmdline:
    The command line invocation of the current program
  • goroutine:
    Stack traces of all current goroutines. Use debug=2 as a query parameter to export in the same format as an unrecovered panic.
  • heap:
    A sampling of memory allocations of live objects. You can specify the gc GET parameter to run GC before taking the heap sample.
  • mutex:
    Stack traces of holders of contended mutexes
  • profile:
    CPU profile. You can specify the duration in the seconds GET parameter. After you get the profile file, use the go tool pprof command to investigate the profile.
  • threadcreate:
    Stack traces that led to the creation of new OS threads
  • trace:
    A trace of execution of the current program. You can specify the duration in the seconds GET parameter. After you get the trace file, use the go tool trace command to investigate the trace.

@foto-andreas
Contributor Author

Pull a stack dump from it?

@foto-andreas
Contributor Author

curl http://localhost:7070/debug/pprof/goroutine?debug=2 
goroutine 3348275 [running]:
runtime/pprof.writeGoroutineStacks({0x30f75e0, 0x4000ebc000})
	runtime/pprof/pprof.go:761 +0x6c
runtime/pprof.writeGoroutine({0x30f75e0?, 0x4000ebc000?}, 0x4001d13708?)
	runtime/pprof/pprof.go:750 +0x2c
runtime/pprof.(*Profile).WriteTo(0x54f2130?, {0x30f75e0?, 0x4000ebc000?}, 0xc?)
	runtime/pprof/pprof.go:374 +0x148
net/http/pprof.handler.ServeHTTP({0x40013ae011, 0x9}, {0x3110d88, 0x4000ebc000}, 0x4002222280)
	net/http/pprof/pprof.go:272 +0x444
net/http/pprof.Index({0x3110d88, 0x4000ebc000}, 0x4002222280?)
	net/http/pprof/pprof.go:388 +0xc8
net/http.HandlerFunc.ServeHTTP(0x5549340?, {0x3110d88?, 0x4000ebc000?}, 0x4000fc6be0?)
	net/http/server.go:2220 +0x38
net/http.(*ServeMux).ServeHTTP(0x4002222140?, {0x3110d88, 0x4000ebc000}, 0x4002222280)
	net/http/server.go:2747 +0x1b4
github.com/gorilla/mux.(*Router).ServeHTTP(0x4001d46480, {0x3110d88, 0x4000ebc000}, 0x4002222000)
	github.com/gorilla/mux@v1.8.1/mux.go:212 +0x194
net/http.serverHandler.ServeHTTP({0x4000f173e0?}, {0x3110d88?, 0x4000ebc000?}, 0x6?)
	net/http/server.go:3210 +0xbc
net/http.(*conn).serve(0x400162a000, {0x31192a0, 0x400211ad80})
	net/http/server.go:2092 +0x4fc
created by net/http.(*Server).Serve in goroutine 1
	net/http/server.go:3360 +0x3dc

goroutine 1 [IO wait]:
internal/poll.runtime_pollWait(0xe61f0b1a59d0, 0x72)
	runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x400229a900?, 0x87?, 0x0)
	internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x400229a900)
	internal/poll/fd_unix.go:620 +0x24c
net.(*netFD).accept(0x400229a900)
	net/fd_unix.go:172 +0x28
net.(*TCPListener).accept(0x4002280c80)
	net/tcpsock_posix.go:159 +0x28
net.(*TCPListener).Accept(0x4002280c80)
	net/tcpsock.go:372 +0x2c
net/http.(*Server).Serve(0x4000bde0f0, {0x3110a28, 0x4002280c80})
	net/http/server.go:3330 +0x294
net/http.(*Server).ListenAndServe(0x4000bde0f0)
	net/http/server.go:3259 +0x84
github.com/evcc-io/evcc/cmd.runRoot(0x551cf20, {0x28583d7?, 0x7?, 0x284bf3f?})
	github.com/evcc-io/evcc/cmd/root.go:324 +0x1cd4
github.com/spf13/cobra.(*Command).execute(0x551cf20, {0x400004c090, 0x1, 0x1})
	github.com/spf13/cobra@v1.8.1/command.go:989 +0x81c
github.com/spf13/cobra.(*Command).ExecuteC(0x551cf20)
	github.com/spf13/cobra@v1.8.1/command.go:1117 +0x344
github.com/spf13/cobra.(*Command).Execute(...)
	github.com/spf13/cobra@v1.8.1/command.go:1041
github.com/evcc-io/evcc/cmd.Execute()
	github.com/evcc-io/evcc/cmd/root.go:112 +0x24
main.main()
	github.com/evcc-io/evcc/main.go:44 +0x38

goroutine 7 [chan receive]:
github.com/evcc-io/evcc/cmd.configureDatabase.func2()
	github.com/evcc-io/evcc/cmd/setup.go:541 +0x64
created by github.com/evcc-io/evcc/cmd.configureDatabase in goroutine 1
	github.com/evcc-io/evcc/cmd/setup.go:540 +0x74

goroutine 6 [select, 14673 minutes]:
database/sql.(*DB).connectionOpener(0x400018f040, {0x31192d8, 0x40001bb220})
	database/sql/sql.go:1253 +0x80
created by database/sql.OpenDB in goroutine 1
	database/sql/sql.go:833 +0x128

goroutine 33 [select, 14673 minutes]:
google.golang.org/grpc/internal/grpcsync.(*CallbackSerializer).run(0x4001cc5210, {0x31192d8, 0x4000917360})
	google.golang.org/grpc@v1.68.0/internal/grpcsync/callback_serializer.go:88 +0xc4
created by google.golang.org/grpc/internal/grpcsync.NewCallbackSerializer in goroutine 1
	google.golang.org/grpc@v1.68.0/internal/grpcsync/callback_serializer.go:52 +0x118

goroutine 38 [select]:
github.com/eclipse/paho%2emqtt%2egolang.keepalive(0x4000fb8288, {0xe61f0b12d728, 0x4000df0388})
	github.com/eclipse/paho.mqtt.golang@v1.5.0/ping.go:48 +0x190
created by github.com/eclipse/paho%2emqtt%2egolang.(*client).startCommsWorkers in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/client.go:595 +0x8ac

goroutine 81 [chan receive]:
github.com/evcc-io/evcc/util.(*Tee).Run(0x4001ad8880, 0x400010ed20)
	github.com/evcc-io/evcc/util/tee.go:39 +0x44
created by github.com/evcc-io/evcc/cmd.runRoot in goroutine 1
	github.com/evcc-io/evcc/cmd/root.go:147 +0x33c

goroutine 39 [chan receive, 14673 minutes]:
github.com/eclipse/paho%2emqtt%2egolang.(*router).matchAndDispatch.func2()
	github.com/eclipse/paho.mqtt.golang@v1.5.0/router.go:174 +0x84
created by github.com/eclipse/paho%2emqtt%2egolang.(*router).matchAndDispatch in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/router.go:173 +0x200

goroutine 41 [select]:
github.com/eclipse/paho%2emqtt%2egolang.(*client).startCommsWorkers.func1()
	github.com/eclipse/paho.mqtt.golang@v1.5.0/client.go:627 +0xf0
created by github.com/eclipse/paho%2emqtt%2egolang.(*client).startCommsWorkers in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/client.go:624 +0x680

goroutine 42 [IO wait]:
internal/poll.runtime_pollWait(0xe61f0b1a64c0, 0x72)
	runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4000c92880?, 0x4000328000?, 0x0)
	internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4000c92880, {0x4000328000, 0xd80, 0xd80})
	internal/poll/fd_unix.go:165 +0x1fc
net.(*netFD).Read(0x4000c92880, {0x4000328000?, 0x4000328082?, 0x5?})
	net/fd_posix.go:55 +0x28
net.(*conn).Read(0x4001b2eff8, {0x4000328000?, 0x4000df0548?, 0x8?})
	net/net.go:189 +0x34
crypto/tls.(*atLeastReader).Read(0x4001fae978, {0x4000328000?, 0x0?, 0x4001fae978?})
	crypto/tls/conn.go:809 +0x40
bytes.(*Buffer).ReadFrom(0x4000df0638, {0x30fbc60, 0x4001fae978})
	bytes/buffer.go:211 +0x90
crypto/tls.(*Conn).readFromUntil(0x4000df0388, {0x30f6920, 0x4001b2eff8}, 0x0?)
	crypto/tls/conn.go:831 +0xd0
crypto/tls.(*Conn).readRecordOrCCS(0x4000df0388, 0x0)
	crypto/tls/conn.go:629 +0x35c
crypto/tls.(*Conn).readRecord(...)
	crypto/tls/conn.go:591
crypto/tls.(*Conn).Read(0x4000df0388, {0x4000c7ea9c, 0x1, 0x9bb938?})
	crypto/tls/conn.go:1385 +0x164
io.ReadAtLeast({0xe61f0b12d748, 0x4000df0388}, {0x4000c7ea9c, 0x1, 0x1}, 0x1)
	io/io.go:335 +0xa0
io.ReadFull(...)
	io/io.go:354
github.com/eclipse/paho.mqtt.golang/packets.ReadPacket({0xe61f0b12d748, 0x4000df0388})
	github.com/eclipse/paho.mqtt.golang@v1.5.0/packets/packets.go:131 +0x5c
github.com/eclipse/paho%2emqtt%2egolang.startIncoming.func1()
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:124 +0xdc
created by github.com/eclipse/paho%2emqtt%2egolang.startIncoming in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:122 +0x140

goroutine 43 [select]:
github.com/eclipse/paho%2emqtt%2egolang.startIncomingComms.func1()
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:175 +0x134
created by github.com/eclipse/paho%2emqtt%2egolang.startIncomingComms in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:164 +0x1a4

goroutine 44 [select]:
github.com/eclipse/paho%2emqtt%2egolang.startOutgoingComms.func1()
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:277 +0x164
created by github.com/eclipse/paho%2emqtt%2egolang.startOutgoingComms in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:264 +0x204

goroutine 45 [chan receive, 14673 minutes]:
github.com/eclipse/paho%2emqtt%2egolang.startComms.func1()
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:407 +0x5c
created by github.com/eclipse/paho%2emqtt%2egolang.startComms in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:406 +0x1b4

goroutine 46 [chan receive, 14673 minutes]:
github.com/eclipse/paho%2emqtt%2egolang.startComms.func2()
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:430 +0x64
created by github.com/eclipse/paho%2emqtt%2egolang.startComms in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:429 +0x214

goroutine 47 [semacquire, 14673 minutes]:
sync.runtime_Semacquire(0x0?)
	runtime/sema.go:71 +0x2c
sync.(*WaitGroup).Wait(0x4001076720)
	sync/waitgroup.go:118 +0x74
github.com/eclipse/paho%2emqtt%2egolang.startComms.func3()
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:438 +0x28
created by github.com/eclipse/paho%2emqtt%2egolang.startComms in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:437 +0x264

goroutine 48 [select, 14673 minutes]:
github.com/eclipse/paho%2emqtt%2egolang.(*client).startCommsWorkers.func2()
	github.com/eclipse/paho.mqtt.golang@v1.5.0/client.go:662 +0xa8
created by github.com/eclipse/paho%2emqtt%2egolang.(*client).startCommsWorkers in goroutine 68
	github.com/eclipse/paho.mqtt.golang@v1.5.0/client.go:657 +0x7bc

goroutine 82 [chan receive]:
github.com/evcc-io/evcc/util/pipe.(*Dropper).pipe(...)
	github.com/evcc-io/evcc/util/pipe/limiter.go:78
created by github.com/evcc-io/evcc/util/pipe.(*Dropper).Pipe in goroutine 1
	github.com/evcc-io/evcc/util/pipe/limiter.go:90 +0x94

goroutine 83 [chan receive]:
github.com/evcc-io/evcc/util.(*Cache).Run(0x4001ad88a0, 0x4001aaf340)
	github.com/evcc-io/evcc/util/cache.go:39 +0x5c
created by github.com/evcc-io/evcc/cmd.runRoot in goroutine 1
	github.com/evcc-io/evcc/cmd/root.go:151 +0x464

goroutine 84 [chan receive]:
github.com/evcc-io/evcc/util/pipe.(*Dropper).pipe(...)
	github.com/evcc-io/evcc/util/pipe/limiter.go:78
created by github.com/evcc-io/evcc/util/pipe.(*Dropper).Pipe in goroutine 1
	github.com/evcc-io/evcc/util/pipe/limiter.go:90 +0x94

goroutine 85 [select]:
github.com/evcc-io/evcc/server.(*SocketHub).Run(0x40016b3530, 0x4001aaf3b0, 0x4001ad88a0)
	github.com/evcc-io/evcc/server/socket.go:147 +0x84
created by github.com/evcc-io/evcc/cmd.runRoot in goroutine 1
	github.com/evcc-io/evcc/cmd/root.go:168 +0x710

goroutine 92 [chan receive (nil chan), 14673 minutes]:
github.com/evcc-io/evcc/meter.NewTibberFromConfig.func3()
	github.com/evcc-io/evcc/meter/tibber-pulse.go:117 +0x40
created by github.com/evcc-io/evcc/meter.NewTibberFromConfig in goroutine 86
	github.com/evcc-io/evcc/meter/tibber-pulse.go:116 +0x8d8

goroutine 442 [chan receive, 33 minutes]:
github.com/evcc-io/evcc/tariff.(*Tibber).run(0x40018df1a0, 0x4001df3650)
	github.com/evcc-io/evcc/tariff/tibber.go:84 +0xd8
created by github.com/evcc-io/evcc/tariff.NewTibberFromConfig in goroutine 438
	github.com/evcc-io/evcc/tariff/tibber.go:71 +0x2e0

goroutine 117 [select]:
github.com/philippseith/signalr.(*loop).Run(0x400218c400, 0x400209c7e0)
	github.com/philippseith/signalr@v0.6.3/loop.go:77 +0xa9c
github.com/philippseith/signalr.(*client).run(0x40004ff400)
	github.com/philippseith/signalr@v0.6.3/client.go:245 +0x10c
github.com/philippseith/signalr.(*client).Start.func1()
	github.com/philippseith/signalr@v0.6.3/client.go:188 +0x108
created by github.com/philippseith/signalr.(*client).Start in goroutine 94
	github.com/philippseith/signalr@v0.6.3/client.go:171 +0x94

goroutine 115 [chan receive, 6152 minutes]:
github.com/evcc-io/evcc/charger.(*Easee).subscribe.func1()
	github.com/evcc-io/evcc/charger/easee.go:248 +0x4c
created by github.com/evcc-io/evcc/charger.(*Easee).subscribe in goroutine 94
	github.com/evcc-io/evcc/charger/easee.go:247 +0xb4

goroutine 458 [select]:
github.com/evcc-io/evcc/core.(*Site).Run(0x400204e400, 0x4002188b60, 0x37e11d600)
	github.com/evcc-io/evcc/core/site.go:1088 +0x20c
github.com/evcc-io/evcc/cmd.runRoot.func4()
	github.com/evcc-io/evcc/cmd/root.go:302 +0x2c
created by github.com/evcc-io/evcc/cmd.runRoot in goroutine 1
	github.com/evcc-io/evcc/cmd/root.go:301 +0x1adc

goroutine 2116403 [select]:
github.com/philippseith/signalr.ReadWriteWithContext({0x3118b30, 0x5574ee0}, 0x4002200750, 0x2d5d148)
	github.com/philippseith/signalr@v0.6.3/connectionbase.go:66 +0xf0
github.com/philippseith/signalr.(*webSocketConnection).Read(0x40021fdae0, {0x400184e000, 0x8000, 0x8000})
	github.com/philippseith/signalr@v0.6.3/websocketconnection.go:47 +0xac
github.com/philippseith/signalr.(*defaultHubConnection).Receive.func1({0x31192d8, 0x4000b86640}, {0xe61f0adafc18, 0x40021fdae0}, {0x30fed40, 0x4001d0a100}, 0x4002346d90, 0x400209c8c0)
	github.com/philippseith/signalr@v0.6.3/hubconnection.go:104 +0x98
created by github.com/philippseith/signalr.(*defaultHubConnection).Receive in goroutine 2116402
	github.com/philippseith/signalr@v0.6.3/hubconnection.go:97 +0x1cc

goroutine 3348276 [runnable]:
syscall.Syscall(0x3f, 0x6, 0x4000f173f1, 0x1)
	syscall/syscall_linux.go:73 +0x20
syscall.read(0x40000c2000?, {0x4000f173f1?, 0x5574e00?, 0xef02d0?})
	syscall/zsyscall_linux_arm64.go:736 +0x40
syscall.Read(...)
	syscall/syscall_unix.go:183
internal/poll.ignoringEINTRIO(...)
	internal/poll/fd_unix.go:745
internal/poll.(*FD).Read(0x40000c2000, {0x4000f173f1, 0x1, 0x1})
	internal/poll/fd_unix.go:161 +0x220
net.(*netFD).Read(0x40000c2000, {0x4000f173f1?, 0x4000854728?, 0x1f39558?})
	net/fd_posix.go:55 +0x28
net.(*conn).Read(0x4000132070, {0x4000f173f1?, 0x400063bb80?, 0x4001d90ec0?})
	net/net.go:189 +0x34
net/http.(*connReader).backgroundRead(0x4000f173e0)
	net/http/server.go:690 +0x40
created by net/http.(*connReader).startBackgroundRead in goroutine 3348275
	net/http/server.go:686 +0xc4

goroutine 422 [select]:
github.com/evcc-io/evcc/core.(*Site).Prepare.func2(0x0)
	github.com/evcc-io/evcc/core/site.go:1049 +0xb4
created by github.com/evcc-io/evcc/core.(*Site).Prepare in goroutine 1
	github.com/evcc-io/evcc/core/site.go:1047 +0x144

goroutine 475 [syscall, 14673 minutes]:
os/signal.signal_recv()
	runtime/sigqueue.go:152 +0x30
os/signal.loop()
	os/signal/signal_unix.go:23 +0x1c
created by os/signal.Notify.func1.1 in goroutine 488
	os/signal/signal.go:151 +0x28

goroutine 460 [IO wait]:
internal/poll.runtime_pollWait(0xe61f0ac36cb8, 0x72)
	runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x400229a980?, 0x2504c?, 0x0)
	internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x400229a980)
	internal/poll/fd_unix.go:620 +0x24c
net.(*netFD).accept(0x400229a980)
	net/fd_unix.go:172 +0x28
net.(*UnixListener).accept(0x4001767e28?)
	net/unixsock_posix.go:172 +0x20
net.(*UnixListener).Accept(0x400211ade0)
	net/unixsock.go:260 +0x2c
net/http.(*Server).Serve(0x40021a60f0, {0x3110a58, 0x400211ade0})
	net/http/server.go:3330 +0x294
github.com/evcc-io/evcc/server.HealthListener.func1()
	github.com/evcc-io/evcc/server/uds.go:39 +0x28
created by github.com/evcc-io/evcc/server.HealthListener in goroutine 459
	github.com/evcc-io/evcc/server/uds.go:39 +0x1b8

goroutine 482 [chan receive]:
github.com/evcc-io/evcc/util/pipe.(*Dropper).pipe(...)
	github.com/evcc-io/evcc/util/pipe/limiter.go:78
created by github.com/evcc-io/evcc/util/pipe.(*Dropper).Pipe in goroutine 1
	github.com/evcc-io/evcc/util/pipe/limiter.go:90 +0x94

goroutine 483 [chan receive]:
github.com/evcc-io/evcc/server.(*MQTT).Run(0x4001a64ea0, {0x3173d28, 0x400204e400}, 0x4002188a80)
	github.com/evcc-io/evcc/server/mqtt.go:286 +0x32c
created by github.com/evcc-io/evcc/cmd.runRoot in goroutine 1
	github.com/evcc-io/evcc/cmd/root.go:226 +0x1234

goroutine 484 [IO wait, 14672 minutes]:
internal/poll.runtime_pollWait(0xe61f0b1a5c00, 0x72)
	runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x400218c280?, 0x50?, 0x0)
	internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).RawRead(0x400218c280, 0x4000640a00)
	internal/poll/fd_unix.go:717 +0x12c
net.(*rawConn).Read(0x4000078ba0, 0x0?)
	net/rawconn.go:44 +0x30
golang.org/x/net/internal/socket.(*Conn).recvMsg(0x4000bed100, 0x4001ce9d40, 0x0)
	golang.org/x/net@v0.31.0/internal/socket/rawconn_msg.go:27 +0x138
golang.org/x/net/internal/socket.(*Conn).RecvMsg(...)
	golang.org/x/net@v0.31.0/internal/socket/socket.go:247
golang.org/x/net/ipv4.(*payloadHandler).ReadFrom(0x4001681140, {0x40020aa000, 0x10000, 0x10000})
	golang.org/x/net@v0.31.0/ipv4/payload_cmsg.go:31 +0x468
github.com/libp2p/zeroconf/v2.(*Server).recv4(0x40016fe8a0, 0x4001681130)
	github.com/libp2p/zeroconf/v2@v2.2.0/server.go:275 +0xb8
created by github.com/libp2p/zeroconf/v2.(*Server).start in goroutine 1
	github.com/libp2p/zeroconf/v2@v2.2.0/server.go:213 +0x90

goroutine 485 [IO wait, 14672 minutes]:
internal/poll.runtime_pollWait(0xe61f0b1a5ae8, 0x72)
	runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x400218c300?, 0x50?, 0x0)
	internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).RawRead(0x400218c300, 0x4000640a50)
	internal/poll/fd_unix.go:717 +0x12c
net.(*rawConn).Read(0x4000078bb0, 0x0?)
	net/rawconn.go:44 +0x30
golang.org/x/net/internal/socket.(*Conn).recvMsg(0x4000bed120, 0x4001ce9da0, 0x0)
	golang.org/x/net@v0.31.0/internal/socket/rawconn_msg.go:27 +0x138
golang.org/x/net/internal/socket.(*Conn).RecvMsg(...)
	golang.org/x/net@v0.31.0/internal/socket/socket.go:247
golang.org/x/net/ipv6.(*payloadHandler).ReadFrom(0x4001681190, {0x4001eca000, 0x10000, 0x10000})
	golang.org/x/net@v0.31.0/ipv6/payload_cmsg.go:31 +0x344
github.com/libp2p/zeroconf/v2.(*Server).recv6(0x40016fe8a0, 0x4001681180)
	github.com/libp2p/zeroconf/v2@v2.2.0/server.go:300 +0xb8
created by github.com/libp2p/zeroconf/v2.(*Server).start in goroutine 1
	github.com/libp2p/zeroconf/v2@v2.2.0/server.go:217 +0x108

goroutine 487 [chan receive, 4423 minutes]:
github.com/evcc-io/evcc/push.(*Hub).Run(0x4002181240, 0x4002185ea0, 0x400010ed20)
	github.com/evcc-io/evcc/push/hub.go:100 +0x5c
created by github.com/evcc-io/evcc/cmd.configureMessengers in goroutine 1
	github.com/evcc-io/evcc/cmd/setup.go:767 +0x358

goroutine 488 [chan receive, 14673 minutes]:
github.com/evcc-io/evcc/cmd.runRoot.func1()
	github.com/evcc-io/evcc/cmd/root.go:256 +0x90
created by github.com/evcc-io/evcc/cmd.runRoot in goroutine 1
	github.com/evcc-io/evcc/cmd/root.go:252 +0x15a4

goroutine 489 [chan receive, 14673 minutes]:
github.com/evcc-io/evcc/cmd.runRoot.func2()
	github.com/evcc-io/evcc/cmd/root.go:262 +0x2c
created by github.com/evcc-io/evcc/cmd.runRoot in goroutine 1
	github.com/evcc-io/evcc/cmd/root.go:261 +0x15f4

goroutine 490 [chan receive, 274 minutes]:
github.com/evcc-io/evcc/server/updater.Run(0x400059c7b0, {0x40015a7483?, 0x9?}, 0x400010ed20)
	github.com/evcc-io/evcc/server/updater/run.go:22 +0x198
created by github.com/evcc-io/evcc/cmd.runRoot in goroutine 1
	github.com/evcc-io/evcc/cmd/root.go:290 +0x1a50

goroutine 491 [select]:
github.com/smallnest/chanx.process[...]({0x3118b30, 0x5574ee0}, 0x40022a1ea0, 0x40022a1f10, 0x4000be5340)
	github.com/smallnest/chanx@v1.2.0/unbounded_chan.go:68 +0xdc
created by github.com/smallnest/chanx.NewUnboundedChanSize[...] in goroutine 1
	github.com/smallnest/chanx@v1.2.0/unbounded_chan.go:47 +0x1d0

goroutine 492 [chan receive]:
github.com/evcc-io/evcc/core.(*Site).Prepare.func1()
	github.com/evcc-io/evcc/core/site.go:1033 +0x78
created by github.com/evcc-io/evcc/core.(*Site).Prepare in goroutine 1
	github.com/evcc-io/evcc/core/site.go:1032 +0xd4

goroutine 494 [chan receive, 274 minutes]:
github.com/evcc-io/evcc/server/updater.(*watch).watchReleases(0x40016f5d58, {0x30ea2e8, 0x7}, 0x40022ac310)
	github.com/evcc-io/evcc/server/updater/watch.go:32 +0x54
created by github.com/evcc-io/evcc/server/updater.Run in goroutine 490
	github.com/evcc-io/evcc/server/updater/run.go:20 +0x12c

goroutine 531 [chan send]:
github.com/evcc-io/evcc/core.(*Site).loopLoadpoints(...)
	github.com/evcc-io/evcc/core/site.go:1068
created by github.com/evcc-io/evcc/core.(*Site).Run in goroutine 458
	github.com/evcc-io/evcc/core/site.go:1083 +0x178

goroutine 2116402 [select]:
github.com/philippseith/signalr.(*loop).Run.func1()
	github.com/philippseith/signalr@v0.6.3/loop.go:57 +0xa4
created by github.com/philippseith/signalr.(*loop).Run in goroutine 117
	github.com/philippseith/signalr@v0.6.3/loop.go:53 +0xc0

goroutine 7870 [select, 14644 minutes]:
google.golang.org/grpc/internal/grpcsync.(*CallbackSerializer).run(0x4001d06ab0, {0x31192d8, 0x4001d8e280})
	google.golang.org/grpc@v1.68.0/internal/grpcsync/callback_serializer.go:88 +0xc4
created by google.golang.org/grpc/internal/grpcsync.NewCallbackSerializer in goroutine 7868
	google.golang.org/grpc@v1.68.0/internal/grpcsync/callback_serializer.go:52 +0x118

goroutine 3348369 [IO wait]:
internal/poll.runtime_pollWait(0xe61f0ac36970, 0x72)
	runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x400013c800?, 0x400052a000?, 0x0)
	internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x400013c800, {0x400052a000, 0x2500, 0x2500})
	internal/poll/fd_unix.go:165 +0x1fc
net.(*netFD).Read(0x400013c800, {0x400052a000?, 0x4000f0efd0?, 0xd?})
	net/fd_posix.go:55 +0x28
net.(*conn).Read(0x4001d0a078, {0x400052a000?, 0x400052a005?, 0x0?})
	net/net.go:189 +0x34
crypto/tls.(*atLeastReader).Read(0x4001ebea08, {0x400052a000?, 0x0?, 0x4001ebea08?})
	crypto/tls/conn.go:809 +0x40
bytes.(*Buffer).ReadFrom(0x4000f0f0b8, {0x30fbc60, 0x4001ebea08})
	bytes/buffer.go:211 +0x90
crypto/tls.(*Conn).readFromUntil(0x4000f0ee08, {0x30f6920, 0x4001d0a078}, 0x0?)
	crypto/tls/conn.go:831 +0xd0
crypto/tls.(*Conn).readRecordOrCCS(0x4000f0ee08, 0x0)
	crypto/tls/conn.go:629 +0x35c
crypto/tls.(*Conn).readRecord(...)
	crypto/tls/conn.go:591
crypto/tls.(*Conn).Read(0x4000f0ee08, {0x4001646000, 0x1000, 0x4001ae5b20?})
	crypto/tls/conn.go:1385 +0x164
net/http.(*readWriteCloserBody).Read(0x0?, {0x4001646000?, 0x0?, 0x2?})
	net/http/transport.go:2504 +0xa8
bufio.(*Reader).fill(0x40010e6360)
	bufio/bufio.go:110 +0xf8
bufio.(*Reader).ReadByte(0x40010e6360)
	bufio/bufio.go:269 +0x2c
nhooyr.io/websocket.readFrameHeader(0x40010e6360, {0x4000fb4068, 0x8, 0x8})
	nhooyr.io/websocket@v1.8.17/frame.go:55 +0x68
nhooyr.io/websocket.(*Conn).readFrameHeader(0x4000fb4000, {0x3118b30, 0x5574ee0})
	nhooyr.io/websocket@v1.8.17/read.go:235 +0x94
nhooyr.io/websocket.(*Conn).readLoop(0x4000fb4000, {0x3118b30, 0x5574ee0})
	nhooyr.io/websocket@v1.8.17/read.go:193 +0x48
nhooyr.io/websocket.(*Conn).reader(0x4000fb4000, {0x3118b30, 0x5574ee0})
	nhooyr.io/websocket@v1.8.17/read.go:355 +0xbc
nhooyr.io/websocket.(*Conn).Reader(...)
	nhooyr.io/websocket@v1.8.17/read.go:39
nhooyr.io/websocket.(*Conn).Read(0x4000855758?, {0x3118b30?, 0x5574ee0?})
	nhooyr.io/websocket@v1.8.17/read.go:47 +0x28
github.com/philippseith/signalr.(*webSocketConnection).Read.func1()
	github.com/philippseith/signalr@v0.6.3/websocketconnection.go:49 +0x58
github.com/philippseith/signalr.ReadWriteWithContext.func1()
	github.com/philippseith/signalr@v0.6.3/connectionbase.go:62 +0x2c
created by github.com/philippseith/signalr.ReadWriteWithContext in goroutine 2116403
	github.com/philippseith/signalr@v0.6.3/connectionbase.go:61 +0xa0

goroutine 7869 [select, 14644 minutes]:
google.golang.org/grpc/internal/grpcsync.(*CallbackSerializer).run(0x4001d06990, {0x31192d8, 0x4001d8e230})
	google.golang.org/grpc@v1.68.0/internal/grpcsync/callback_serializer.go:88 +0xc4
created by google.golang.org/grpc/internal/grpcsync.NewCallbackSerializer in goroutine 7868
	google.golang.org/grpc@v1.68.0/internal/grpcsync/callback_serializer.go:52 +0x118

goroutine 2116346 [select]:
nhooyr.io/websocket.(*Conn).timeoutLoop(0x4000fb4000)
	nhooyr.io/websocket@v1.8.17/conn.go:180 +0x130
created by nhooyr.io/websocket.newConn in goroutine 117
	nhooyr.io/websocket@v1.8.17/conn.go:140 +0x484

goroutine 2116404 [select]:
github.com/philippseith/signalr.(*pipe).Read(0x4002346e00, {0x4001238000, 0x8000, 0x54e9320?})
	github.com/philippseith/signalr@v0.6.3/ctxpipe.go:53 +0xa8
github.com/philippseith/signalr.(*PipeReader).Read(0x40022006c0?, {0x4001238000?, 0x4001eb8a50?, 0x0?})
	github.com/philippseith/signalr@v0.6.3/ctxpipe.go:130 +0x24
github.com/philippseith/signalr.readJSONFrames({0x30fed60, 0x4001d0a0f8}, 0x4001eb8a50)
	github.com/philippseith/signalr@v0.6.3/jsonhubprotocol.go:162 +0x78
github.com/philippseith/signalr.(*jsonHubProtocol).ParseMessages(0x40012c24e0, {0x30fed60?, 0x4001d0a0f8?}, 0x0?)
	github.com/philippseith/signalr@v0.6.3/jsonhubprotocol.go:68 +0x38
github.com/philippseith/signalr.(*defaultHubConnection).Receive.func2({0x31192d8, 0x4000b86640}, {0x30fed60, 0x4001d0a0f8}, 0x4002346d90, 0x400209c8c0)
	github.com/philippseith/signalr@v0.6.3/hubconnection.go:138 +0xb4
created by github.com/philippseith/signalr.(*defaultHubConnection).Receive in goroutine 2116402
	github.com/philippseith/signalr@v0.6.3/hubconnection.go:128 +0x2b0

goroutine 3194046 [chan send, 1902 minutes]:
github.com/hasura/go-graphql-client.(*SubscriptionClient).Run.func1()
	github.com/hasura/go-graphql-client@v0.13.1/subscription.go:785 +0x328
created by github.com/hasura/go-graphql-client.(*SubscriptionClient).Run in goroutine 93
	github.com/hasura/go-graphql-client@v0.13.1/subscription.go:775 +0x180

@foto-andreas
curl http://localhost:7070/debug/pprof/goroutine?debug=1
goroutine profile: total 49
3 @ 0x85ee8 0x186a4 0x18274 0x1ff51b8 0x8e704
#	0x1ff51b7	github.com/evcc-io/evcc/util/pipe.(*Dropper).pipe+0x57	github.com/evcc-io/evcc/util/pipe/limiter.go:78

3 @ 0x85ee8 0x6167c 0x68c474 0x8e704
#	0x68c473	google.golang.org/grpc/internal/grpcsync.(*CallbackSerializer).run+0xc3	google.golang.org/grpc@v1.68.0/internal/grpcsync/callback_serializer.go:88

1 @ 0x1e1e4 0x883d0 0x5bc48c 0x8e704
#	0x883cf		os/signal.signal_recv+0x2f	runtime/sigqueue.go:152
#	0x5bc48b	os/signal.loop+0x1b		os/signal/signal_unix.go:23

1 @ 0x42e2c 0x84cf4 0x1fcbbc4 0x1fcb9e0 0x1fc8f78 0x1fddb84 0x1fde498 0x31f608 0x321274 0xf786a4 0x33dfac 0x31e1bc 0x8e704
#	0x1fcbbc3	runtime/pprof.writeRuntimeProfile+0xb3			runtime/pprof/pprof.go:793
#	0x1fcb9df	runtime/pprof.writeGoroutine+0x4f			runtime/pprof/pprof.go:752
#	0x1fc8f77	runtime/pprof.(*Profile).WriteTo+0x147			runtime/pprof/pprof.go:374
#	0x1fddb83	net/http/pprof.handler.ServeHTTP+0x443			net/http/pprof/pprof.go:272
#	0x1fde497	net/http/pprof.Index+0xc7				net/http/pprof/pprof.go:388
#	0x31f607	net/http.HandlerFunc.ServeHTTP+0x37			net/http/server.go:2220
#	0x321273	net/http.(*ServeMux).ServeHTTP+0x1b3			net/http/server.go:2747
#	0xf786a3	github.com/gorilla/mux.(*Router).ServeHTTP+0x193	github.com/gorilla/mux@v1.8.1/mux.go:212
#	0x33dfab	net/http.serverHandler.ServeHTTP+0xbb			net/http/server.go:3210
#	0x31e1bb	net/http.(*conn).serve+0x4fb				net/http/server.go:2092

1 @ 0x85ee8 0x17790 0x173c8 0x1facc78 0x8e704
#	0x1facc77	github.com/evcc-io/evcc/core.(*Site).loopLoadpoints+0x67	github.com/evcc-io/evcc/core/site.go:1068

1 @ 0x85ee8 0x17790 0x173c8 0xe74ff8 0x8e704
#	0xe74ff7	github.com/hasura/go-graphql-client.(*SubscriptionClient).Run.func1+0x327	github.com/hasura/go-graphql-client@v0.13.1/subscription.go:785

1 @ 0x85ee8 0x1842c 0x18254 0xea0cf0 0x8e704
#	0xea0cef	github.com/evcc-io/evcc/meter.NewTibberFromConfig.func3+0x3f	github.com/evcc-io/evcc/meter/tibber-pulse.go:117

1 @ 0x85ee8 0x186a4 0x18254 0x1f80ec8 0x8e704
#	0x1f80ec7	github.com/evcc-io/evcc/tariff.(*Tibber).run+0xd7	github.com/evcc-io/evcc/tariff/tibber.go:84

1 @ 0x85ee8 0x186a4 0x18254 0x1ff44d4 0x8e704
#	0x1ff44d3	github.com/evcc-io/evcc/server/updater.(*watch).watchReleases+0x53	github.com/evcc-io/evcc/server/updater/watch.go:32

1 @ 0x85ee8 0x186a4 0x18254 0x209253c 0x8e704
#	0x209253b	github.com/evcc-io/evcc/cmd.runRoot.func2+0x2b	github.com/evcc-io/evcc/cmd/root.go:262

1 @ 0x85ee8 0x186a4 0x18254 0x2092660 0x8e704
#	0x209265f	github.com/evcc-io/evcc/cmd.runRoot.func1+0x8f	github.com/evcc-io/evcc/cmd/root.go:256

1 @ 0x85ee8 0x186a4 0x18274 0x1016a4c 0x8e704
#	0x1016a4b	github.com/evcc-io/evcc/charger.(*Easee).subscribe.func1+0x4b	github.com/evcc-io/evcc/charger/easee.go:248

1 @ 0x85ee8 0x186a4 0x18274 0x1e244ac 0x8e704
#	0x1e244ab	github.com/evcc-io/evcc/push.(*Hub).Run+0x5b	github.com/evcc-io/evcc/push/hub.go:100

1 @ 0x85ee8 0x186a4 0x18274 0x1effb8c 0x8e704
#	0x1effb8b	github.com/evcc-io/evcc/server.(*MQTT).Run+0x32b	github.com/evcc-io/evcc/server/mqtt.go:286

1 @ 0x85ee8 0x186a4 0x18274 0x1fac8b8 0x8e704
#	0x1fac8b7	github.com/evcc-io/evcc/core.(*Site).Prepare.func1+0x77	github.com/evcc-io/evcc/core/site.go:1033

1 @ 0x85ee8 0x186a4 0x18274 0x1ff43c8 0x8e704
#	0x1ff43c7	github.com/evcc-io/evcc/server/updater.Run+0x197	github.com/evcc-io/evcc/server/updater/run.go:22

1 @ 0x85ee8 0x186a4 0x18274 0x2096c84 0x8e704
#	0x2096c83	github.com/evcc-io/evcc/cmd.configureDatabase.func2+0x63	github.com/evcc-io/evcc/cmd/setup.go:541

1 @ 0x85ee8 0x186a4 0x18274 0x4d99dc 0x8e704
#	0x4d99db	github.com/evcc-io/evcc/util.(*Cache).Run+0x5b	github.com/evcc-io/evcc/util/cache.go:39

1 @ 0x85ee8 0x186a4 0x18274 0x4ddee4 0x8e704
#	0x4ddee3	github.com/evcc-io/evcc/util.(*Tee).Run+0x43	github.com/evcc-io/evcc/util/tee.go:39

1 @ 0x85ee8 0x186a4 0x18274 0x9e0544 0x8e704
#	0x9e0543	github.com/eclipse/paho%2emqtt%2egolang.startComms.func2+0x63	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:430

1 @ 0x85ee8 0x186a4 0x18274 0x9e05cc 0x8e704
#	0x9e05cb	github.com/eclipse/paho%2emqtt%2egolang.startComms.func1+0x5b	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:407

1 @ 0x85ee8 0x186a4 0x18274 0x9e31a4 0x8e704
#	0x9e31a3	github.com/eclipse/paho%2emqtt%2egolang.(*router).matchAndDispatch.func2+0x83	github.com/eclipse/paho.mqtt.golang@v1.5.0/router.go:174

1 @ 0x85ee8 0x46b38 0x85190 0x110288 0x1115ac 0x11159d 0x1f61f8 0x208484 0x261480 0x14a7e0 0x261660 0x25edfc 0x264d94 0x264d95 0x339898 0x29fe58 0x2a061c 0xef7688 0xef9384 0xef8e78 0xef9ecc 0xef8428 0xef8421 0xf291a8 0xf1aa3c 0x8e704
#	0x8518f		internal/poll.runtime_pollWait+0x9f					runtime/netpoll.go:351
#	0x110287	internal/poll.(*pollDesc).wait+0x27					internal/poll/fd_poll_runtime.go:84
#	0x1115ab	internal/poll.(*pollDesc).waitRead+0x1fb				internal/poll/fd_poll_runtime.go:89
#	0x11159c	internal/poll.(*FD).Read+0x1ec						internal/poll/fd_unix.go:165
#	0x1f61f7	net.(*netFD).Read+0x27							net/fd_posix.go:55
#	0x208483	net.(*conn).Read+0x33							net/net.go:189
#	0x26147f	crypto/tls.(*atLeastReader).Read+0x3f					crypto/tls/conn.go:809
#	0x14a7df	bytes.(*Buffer).ReadFrom+0x8f						bytes/buffer.go:211
#	0x26165f	crypto/tls.(*Conn).readFromUntil+0xcf					crypto/tls/conn.go:831
#	0x25edfb	crypto/tls.(*Conn).readRecordOrCCS+0x35b				crypto/tls/conn.go:629
#	0x264d93	crypto/tls.(*Conn).readRecord+0x163					crypto/tls/conn.go:591
#	0x264d94	crypto/tls.(*Conn).Read+0x164						crypto/tls/conn.go:1385
#	0x339897	net/http.(*readWriteCloserBody).Read+0xa7				net/http/transport.go:2504
#	0x29fe57	bufio.(*Reader).fill+0xf7						bufio/bufio.go:110
#	0x2a061b	bufio.(*Reader).ReadByte+0x2b						bufio/bufio.go:269
#	0xef7687	nhooyr.io/websocket.readFrameHeader+0x67				nhooyr.io/websocket@v1.8.17/frame.go:55
#	0xef9383	nhooyr.io/websocket.(*Conn).readFrameHeader+0x93			nhooyr.io/websocket@v1.8.17/read.go:235
#	0xef8e77	nhooyr.io/websocket.(*Conn).readLoop+0x47				nhooyr.io/websocket@v1.8.17/read.go:193
#	0xef9ecb	nhooyr.io/websocket.(*Conn).reader+0xbb					nhooyr.io/websocket@v1.8.17/read.go:355
#	0xef8427	nhooyr.io/websocket.(*Conn).Reader+0x27					nhooyr.io/websocket@v1.8.17/read.go:39
#	0xef8420	nhooyr.io/websocket.(*Conn).Read+0x20					nhooyr.io/websocket@v1.8.17/read.go:47
#	0xf291a7	github.com/philippseith/signalr.(*webSocketConnection).Read.func1+0x57	github.com/philippseith/signalr@v0.6.3/websocketconnection.go:49
#	0xf1aa3b	github.com/philippseith/signalr.ReadWriteWithContext.func1+0x2b		github.com/philippseith/signalr@v0.6.3/connectionbase.go:62

1 @ 0x85ee8 0x46b38 0x85190 0x110288 0x1115ac 0x11159d 0x1f61f8 0x208484 0x261480 0x14a7e0 0x261660 0x25edfc 0x264d94 0x264d95 0x9e010 0x9bb95c 0x9bb93d 0x9ddf5c 0x8e704
#	0x8518f		internal/poll.runtime_pollWait+0x9f					runtime/netpoll.go:351
#	0x110287	internal/poll.(*pollDesc).wait+0x27					internal/poll/fd_poll_runtime.go:84
#	0x1115ab	internal/poll.(*pollDesc).waitRead+0x1fb				internal/poll/fd_poll_runtime.go:89
#	0x11159c	internal/poll.(*FD).Read+0x1ec						internal/poll/fd_unix.go:165
#	0x1f61f7	net.(*netFD).Read+0x27							net/fd_posix.go:55
#	0x208483	net.(*conn).Read+0x33							net/net.go:189
#	0x26147f	crypto/tls.(*atLeastReader).Read+0x3f					crypto/tls/conn.go:809
#	0x14a7df	bytes.(*Buffer).ReadFrom+0x8f						bytes/buffer.go:211
#	0x26165f	crypto/tls.(*Conn).readFromUntil+0xcf					crypto/tls/conn.go:831
#	0x25edfb	crypto/tls.(*Conn).readRecordOrCCS+0x35b				crypto/tls/conn.go:629
#	0x264d93	crypto/tls.(*Conn).readRecord+0x163					crypto/tls/conn.go:591
#	0x264d94	crypto/tls.(*Conn).Read+0x164						crypto/tls/conn.go:1385
#	0x9e00f		io.ReadAtLeast+0x9f							io/io.go:335
#	0x9bb95b	io.ReadFull+0x5b							io/io.go:354
#	0x9bb93c	github.com/eclipse/paho.mqtt.golang/packets.ReadPacket+0x3c		github.com/eclipse/paho.mqtt.golang@v1.5.0/packets/packets.go:131
#	0x9ddf5b	github.com/eclipse/paho%2emqtt%2egolang.startIncoming.func1+0xdb	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:124

1 @ 0x85ee8 0x46b38 0x85190 0x110288 0x114d4c 0x114d3d 0x1f7dc8 0x212478 0x21167c 0x322d24 0x322a54 0x2092104 0x5af14c 0x5af924 0x20903d4 0x20903c9 0x20a4098 0x4df48 0x8e704
#	0x8518f		internal/poll.runtime_pollWait+0x9f			runtime/netpoll.go:351
#	0x110287	internal/poll.(*pollDesc).wait+0x27			internal/poll/fd_poll_runtime.go:84
#	0x114d4b	internal/poll.(*pollDesc).waitRead+0x24b		internal/poll/fd_poll_runtime.go:89
#	0x114d3c	internal/poll.(*FD).Accept+0x23c			internal/poll/fd_unix.go:620
#	0x1f7dc7	net.(*netFD).accept+0x27				net/fd_unix.go:172
#	0x212477	net.(*TCPListener).accept+0x27				net/tcpsock_posix.go:159
#	0x21167b	net.(*TCPListener).Accept+0x2b				net/tcpsock.go:372
#	0x322d23	net/http.(*Server).Serve+0x293				net/http/server.go:3330
#	0x322a53	net/http.(*Server).ListenAndServe+0x83			net/http/server.go:3259
#	0x2092103	github.com/evcc-io/evcc/cmd.runRoot+0x1cd3		github.com/evcc-io/evcc/cmd/root.go:324
#	0x5af14b	github.com/spf13/cobra.(*Command).execute+0x81b		github.com/spf13/cobra@v1.8.1/command.go:989
#	0x5af923	github.com/spf13/cobra.(*Command).ExecuteC+0x343	github.com/spf13/cobra@v1.8.1/command.go:1117
#	0x20903d3	github.com/spf13/cobra.(*Command).Execute+0x23		github.com/spf13/cobra@v1.8.1/command.go:1041
#	0x20903c8	github.com/evcc-io/evcc/cmd.Execute+0x18		github.com/evcc-io/evcc/cmd/root.go:112
#	0x20a4097	main.main+0x37						github.com/evcc-io/evcc/main.go:44
#	0x4df47		runtime.main+0x287					runtime/proc.go:272

1 @ 0x85ee8 0x46b38 0x85190 0x110288 0x114d4c 0x114d3d 0x1f7dc8 0x219290 0x217e7c 0x322d24 0x1f01a28 0x8e704
#	0x8518f		internal/poll.runtime_pollWait+0x9f				runtime/netpoll.go:351
#	0x110287	internal/poll.(*pollDesc).wait+0x27				internal/poll/fd_poll_runtime.go:84
#	0x114d4b	internal/poll.(*pollDesc).waitRead+0x24b			internal/poll/fd_poll_runtime.go:89
#	0x114d3c	internal/poll.(*FD).Accept+0x23c				internal/poll/fd_unix.go:620
#	0x1f7dc7	net.(*netFD).accept+0x27					net/fd_unix.go:172
#	0x21928f	net.(*UnixListener).accept+0x1f					net/unixsock_posix.go:172
#	0x217e7b	net.(*UnixListener).Accept+0x2b					net/unixsock.go:260
#	0x322d23	net/http.(*Server).Serve+0x293					net/http/server.go:3330
#	0x1f01a27	github.com/evcc-io/evcc/server.HealthListener.func1+0x27	github.com/evcc-io/evcc/server/uds.go:39

1 @ 0x85ee8 0x46b38 0x85190 0x110288 0x1159ec 0x1159d9 0x20cb10 0xc3c3c8 0xc46e08 0xc46df9 0x2081978 0x8e704
#	0x8518f		internal/poll.runtime_pollWait+0x9f			runtime/netpoll.go:351
#	0x110287	internal/poll.(*pollDesc).wait+0x27			internal/poll/fd_poll_runtime.go:84
#	0x1159eb	internal/poll.(*pollDesc).waitRead+0x12b		internal/poll/fd_poll_runtime.go:89
#	0x1159d8	internal/poll.(*FD).RawRead+0x118			internal/poll/fd_unix.go:717
#	0x20cb0f	net.(*rawConn).Read+0x2f				net/rawconn.go:44
#	0xc3c3c7	golang.org/x/net/internal/socket.(*Conn).recvMsg+0x137	golang.org/x/net@v0.31.0/internal/socket/rawconn_msg.go:27
#	0xc46e07	golang.org/x/net/internal/socket.(*Conn).RecvMsg+0x467	golang.org/x/net@v0.31.0/internal/socket/socket.go:247
#	0xc46df8	golang.org/x/net/ipv4.(*payloadHandler).ReadFrom+0x458	golang.org/x/net@v0.31.0/ipv4/payload_cmsg.go:31
#	0x2081977	github.com/libp2p/zeroconf/v2.(*Server).recv4+0xb7	github.com/libp2p/zeroconf/v2@v2.2.0/server.go:275

1 @ 0x85ee8 0x46b38 0x85190 0x110288 0x1159ec 0x1159d9 0x20cb10 0xc3c3c8 0xc4f354 0xc4f349 0x2081b28 0x8e704
#	0x8518f		internal/poll.runtime_pollWait+0x9f			runtime/netpoll.go:351
#	0x110287	internal/poll.(*pollDesc).wait+0x27			internal/poll/fd_poll_runtime.go:84
#	0x1159eb	internal/poll.(*pollDesc).waitRead+0x12b		internal/poll/fd_poll_runtime.go:89
#	0x1159d8	internal/poll.(*FD).RawRead+0x118			internal/poll/fd_unix.go:717
#	0x20cb0f	net.(*rawConn).Read+0x2f				net/rawconn.go:44
#	0xc3c3c7	golang.org/x/net/internal/socket.(*Conn).recvMsg+0x137	golang.org/x/net@v0.31.0/internal/socket/rawconn_msg.go:27
#	0xc4f353	golang.org/x/net/internal/socket.(*Conn).RecvMsg+0x343	golang.org/x/net@v0.31.0/internal/socket/socket.go:247
#	0xc4f348	golang.org/x/net/ipv6.(*payloadHandler).ReadFrom+0x338	golang.org/x/net@v0.31.0/ipv6/payload_cmsg.go:31
#	0x2081b27	github.com/libp2p/zeroconf/v2.(*Server).recv6+0xb7	github.com/libp2p/zeroconf/v2@v2.2.0/server.go:300

1 @ 0x85ee8 0x6167c 0x1f00db4 0x8e704
#	0x1f00db3	github.com/evcc-io/evcc/server.(*SocketHub).Run+0x83	github.com/evcc-io/evcc/server/socket.go:147

1 @ 0x85ee8 0x6167c 0x1fac7c4 0x8e704
#	0x1fac7c3	github.com/evcc-io/evcc/core.(*Site).Prepare.func2+0xb3	github.com/evcc-io/evcc/core/site.go:1049

1 @ 0x85ee8 0x6167c 0x1facb8c 0x209239c 0x8e704
#	0x1facb8b	github.com/evcc-io/evcc/core.(*Site).Run+0x20b	github.com/evcc-io/evcc/core/site.go:1088
#	0x209239b	github.com/evcc-io/evcc/cmd.runRoot.func4+0x2b	github.com/evcc-io/evcc/cmd/root.go:302

1 @ 0x85ee8 0x6167c 0x1fb303c 0x8e704
#	0x1fb303b	github.com/smallnest/chanx.process[...]+0xdb	github.com/smallnest/chanx@v1.2.0/unbounded_chan.go:68

1 @ 0x85ee8 0x6167c 0x4fa980 0x8e704
#	0x4fa97f	database/sql.(*DB).connectionOpener+0x7f	database/sql/sql.go:1253

1 @ 0x85ee8 0x6167c 0x9d64f8 0x8e704
#	0x9d64f7	github.com/eclipse/paho%2emqtt%2egolang.(*client).startCommsWorkers.func2+0xa7	github.com/eclipse/paho.mqtt.golang@v1.5.0/client.go:662

1 @ 0x85ee8 0x6167c 0x9d6910 0x8e704
#	0x9d690f	github.com/eclipse/paho%2emqtt%2egolang.(*client).startCommsWorkers.func1+0xef	github.com/eclipse/paho.mqtt.golang@v1.5.0/client.go:627

1 @ 0x85ee8 0x6167c 0x9de3b4 0x8e704
#	0x9de3b3	github.com/eclipse/paho%2emqtt%2egolang.startIncomingComms.func1+0x133	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:175

1 @ 0x85ee8 0x6167c 0x9df4f4 0x8e704
#	0x9df4f3	github.com/eclipse/paho%2emqtt%2egolang.startOutgoingComms.func1+0x163	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:277

1 @ 0x85ee8 0x6167c 0x9e1da0 0x8e704
#	0x9e1d9f	github.com/eclipse/paho%2emqtt%2egolang.keepalive+0x18f	github.com/eclipse/paho.mqtt.golang@v1.5.0/ping.go:48

1 @ 0x85ee8 0x6167c 0xef4680 0x8e704
#	0xef467f	nhooyr.io/websocket.(*Conn).timeoutLoop+0x12f	nhooyr.io/websocket@v1.8.17/conn.go:180

1 @ 0x85ee8 0x6167c 0xf1a960 0xf2908c 0xf1d4b8 0x8e704
#	0xf1a95f	github.com/philippseith/signalr.ReadWriteWithContext+0xef			github.com/philippseith/signalr@v0.6.3/connectionbase.go:66
#	0xf2908b	github.com/philippseith/signalr.(*webSocketConnection).Read+0xab		github.com/philippseith/signalr@v0.6.3/websocketconnection.go:47
#	0xf1d4b7	github.com/philippseith/signalr.(*defaultHubConnection).Receive.func1+0x97	github.com/philippseith/signalr@v0.6.3/hubconnection.go:104

1 @ 0x85ee8 0x6167c 0xf1aeb8 0xf1b604 0xf20068 0xf1f588 0xf1d254 0x8e704
#	0xf1aeb7	github.com/philippseith/signalr.(*pipe).Read+0xa7				github.com/philippseith/signalr@v0.6.3/ctxpipe.go:53
#	0xf1b603	github.com/philippseith/signalr.(*PipeReader).Read+0x23				github.com/philippseith/signalr@v0.6.3/ctxpipe.go:130
#	0xf20067	github.com/philippseith/signalr.readJSONFrames+0x77				github.com/philippseith/signalr@v0.6.3/jsonhubprotocol.go:162
#	0xf1f587	github.com/philippseith/signalr.(*jsonHubProtocol).ParseMessages+0x37		github.com/philippseith/signalr@v0.6.3/jsonhubprotocol.go:68
#	0xf1d253	github.com/philippseith/signalr.(*defaultHubConnection).Receive.func2+0xb3	github.com/philippseith/signalr@v0.6.3/hubconnection.go:138

1 @ 0x85ee8 0x6167c 0xf2164c 0xf15c5c 0xf157f8 0x8e704
#	0xf2164b	github.com/philippseith/signalr.(*loop).Run+0xa9b		github.com/philippseith/signalr@v0.6.3/loop.go:77
#	0xf15c5b	github.com/philippseith/signalr.(*client).run+0x10b		github.com/philippseith/signalr@v0.6.3/client.go:245
#	0xf157f7	github.com/philippseith/signalr.(*client).Start.func1+0x107	github.com/philippseith/signalr@v0.6.3/client.go:188

1 @ 0x85ee8 0x6167c 0xf218c4 0x8e704
#	0xf218c3	github.com/philippseith/signalr.(*loop).Run.func1+0xa3	github.com/philippseith/signalr@v0.6.3/loop.go:57

1 @ 0x85ee8 0x62518 0x624f5 0x8781c 0x9d4c4 0x9e0448 0x8e704
#	0x8781b		sync.runtime_Semacquire+0x2b					runtime/sema.go:71
#	0x9d4c3		sync.(*WaitGroup).Wait+0x73					sync/waitgroup.go:118
#	0x9e0447	github.com/eclipse/paho%2emqtt%2egolang.startComms.func3+0x27	github.com/eclipse/paho.mqtt.golang@v1.5.0/net.go:438

1 @ 0x8e704

@andig
Member

andig commented Jan 8, 2025

Unfortunately I don't see anything suspicious there. Evidently a Tibber tariff, a Tibber Pulse, and an Easee are running. No deadlock visible.

/cc @GrimmiMeloni do you see anything?

@foto-andreas
Contributor Author

There is also the deadlock debug output, but it is binary. Does that help?

@foto-andreas
Contributor Author

I'll go looking again for whether something else in or around the container is causing trouble. If nobody else is running into this problem, it does look like the cause lies outside of evcc.

@foto-andreas
Contributor Author

  • Restarted the surrounding Traefik, without success
  • Both EVCC containers are shown as unhealthy in "docker ps"
  • curl against the UI from inside the container returns a 200 and delivers HTML
  • curl against the container directly from outside doesn't work, since the port is not published; Traefik handles that
  • Publishing the port via iptables doesn't work either; I still get a "failed to connect"

@andig
Member

andig commented Jan 9, 2025

curl against the UI from inside the container returns a 200 and delivers HTML

So the UI is reachable after all!

@foto-andreas
Contributor Author

Yep, but only from inside the container itself. I have now stopped and restarted one of the Docker containers; that fixes the situation for this container. So the problem apparently lies somewhere in the container's Docker layer. If anyone still has an idea about this, I'm all ears; otherwise I'll look at the health condition and/or build something around it that restarts the container every 15 minutes as long as Tibber is unreachable. Thanks for all the debugging help so far!

@GrimmiMeloni
Collaborator

At first glance I don't see anything suspicious either.

Just to be sure: the evcc logs keep running, i.e. logs are written in every interval? If so, can you produce a trace log?

One more idea: maybe Tibber is a red herring and merely a side effect of a general connectivity problem. What does the network stack look like otherwise? From the context I gathered that Traefik sits in front as a reverse proxy. How is the container networked? Bridge, host?

@foto-andreas
Contributor Author

Network-wise, the Docker Compose files contain the following entry:

networks:
  traefik:
    external: true

The corresponding network:

[
    {
        "Name": "traefik",
        "Id": "97197967bfae01a49f4f67f5cbc1099dd92a510d16df45b5013b4a9639a00d9a",
        "Created": "2024-11-14T20:25:28.159153888+01:00",
        "Scope": "local",
        "Driver": "bridge",
        "EnableIPv6": true,
        "IPAM": {
            "Driver": "default",
            "Options": {},
            "Config": [
                {
                    "Subnet": "172.18.0.0/16",
                    "Gateway": "172.18.0.1"
                },
                {
                    "Subnet": "fdd1:c4ab:6c19::/64",
                    "Gateway": "fdd1:c4ab:6c19::1"
                }
            ]
        },
        "Internal": false,
        "Attachable": false,
        "Ingress": false,
        "ConfigFrom": {
            "Network": ""
        },
        "ConfigOnly": false,
        "Containers": {
            "887998dba732b8341c38ea4ecc2fbe37d775f3bd6dc0bee68172bf39114ba3dc": {
                "Name": "evcc-py",
                "EndpointID": "847965ec3cb587cb60db4749cd654bf65db9d699c2260cca5e36185820e719ab",
                "MacAddress": "02:42:ac:12:00:04",
                "IPv4Address": "172.18.0.4/16",
                "IPv6Address": "fdd1:c4ab:6c19::4/64"
            },
            "ce21c690101f134cd2c46e77393a2159d52f6a65365b7e2b7220fdf121fedb0a": {
                "Name": "evcc-wk",
                "EndpointID": "9a03a680e4267ce5cb227819e26ea3489949f02d852fecc53f7d926c95019283",
                "MacAddress": "02:42:ac:12:00:03",
                "IPv4Address": "172.18.0.3/16",
                "IPv6Address": "fdd1:c4ab:6c19::3/64"
            },
            "ecaa1e6e26e73df8295f25f6ef91de7960b2d6f438431da7d3589f209a053c57": {
                "Name": "traefik",
                "EndpointID": "14bd0648f69d5451be50326e10a25f1292d24f7ccacebbb0a81de03b870bcd04",
                "MacAddress": "02:42:ac:12:00:02",
                "IPv4Address": "172.18.0.2/16",
                "IPv6Address": "fdd1:c4ab:6c19::2/64"
            }
        },
        "Options": {},
        "Labels": {}
    }
]

The EVCC containers do not publish their ports, since that is the simplest way to keep them off the internet.

Traefik acts as a reverse proxy, reaches the containers directly via its Docker integration, takes care of authentication, SSL, and so on, and then exposes the EVCC UI to the internet under hostnames.
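As a sketch, that kind of setup usually looks roughly like the compose fragment below. The hostnames, the `websecure` entrypoint, and the image tag are assumptions for illustration, not taken from the actual config; evcc's default listen port 7070 is assumed as well:

```yaml
# Hypothetical compose service: no "ports:" section, so nothing is
# published on the host. Traefik reaches the container over the shared
# external "traefik" bridge network and routes by hostname.
services:
  evcc-wk:
    image: evcc/evcc:latest
    networks:
      - traefik
    labels:
      - traefik.enable=true
      - traefik.http.routers.evcc-wk.rule=Host(`evcc-wk.example.com`)
      - traefik.http.routers.evcc-wk.entrypoints=websecure
      - traefik.http.routers.evcc-wk.tls=true
      - traefik.http.services.evcc-wk.loadbalancer.server.port=7070

networks:
  traefik:
    external: true
```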

The logs keep running regularly and consistently show the "grid power: outdated" block described above.

Feel free to ask further questions. I'm setting everything to trace now, but will have to wait until the problem reoccurs before more trace output is available.

@GrimmiMeloni
Collaborator

The EVCC containers do not publish their ports, since that is the simplest way to keep them off the internet.

Is that so? Doesn't there at least need to be an expose?

And isn't it only Traefik that has one foot in the internet?

@foto-andreas
Contributor Author

Yes, that's how it is. Everything travels inside the Docker network and otherwise works fine; we use this setup for various applications.

I didn't understand the second paragraph.

@foto-andreas
Contributor Author

Something about this problem must be "visible" inside the container as well, though, because evcc itself reports unhealthy.

@foto-andreas
Contributor Author

foto-andreas commented Jan 10, 2025

I now have a trace from the moment of the failure:

evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:49:52 !! pvScalePhases EXIT
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:49:52 pv charge current: 0A = 0A + -1.86A (427W @ 1p)
evcc-wk  | [cache ] TRACE 2025/01/10 15:49:52 lp-1/planProjectedStart: 0001-01-01 00:00:00 +0000 UTC
evcc-wk  | [cache ] TRACE 2025/01/10 15:49:52 lp-1/planProjectedEnd: 0001-01-01 00:00:00 +0000 UTC
evcc-wk  | [cache ] TRACE 2025/01/10 15:49:52 lp-1/planOverrun: 0s
evcc-wk  | [cache ] TRACE 2025/01/10 15:49:52 greenShareHome: 0.016369273863987135
evcc-wk  | [cache ] TRACE 2025/01/10 15:49:52 greenShareLoadpoints: 0
evcc-wk  | [cache ] TRACE 2025/01/10 15:49:52 tariffPriceHome: 0.35651557131207584
evcc-wk  | [cache ] TRACE 2025/01/10 15:49:52 tariffPriceLoadpoints: 0.3604
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 {"type":"connection_init","payload":{"token":"***"}} client
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 {"type":"connection_init","payload":{"token":"***"}} client
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 {"type":"connection_init","payload":{"token":"***"}} client
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 {"type":"connection_ack"} server
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 {"id":"5e6ad0f4-fc6d-4ee9-8a3e-xxxx","type":"subscribe","payload":{"query":"subscription ($homeId:ID!){liveMeasurement(homeId: $homeId){power,powerProduction,lastMeterConsumption,las>
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 {"id":"5e6ad0f4-fc6d-4ee9-8a3e-xxxx","type":"next","payload":{"errors":[{"message":"unable to start stream api:a7a10770336a4d0198ecea09c28da682 for device a7a10770-336a-4d01-98ec-ea0>
evcc-wk  | [pulse ] ERROR 2025/01/10 15:49:52 Message: unable to start stream api:a7a10770336a4d0198ecea09c28da682 for device a7a10770-336a-4d01-98ec-xxxx; userId 29fbf633-dee7-4959-9d19-xxxx, Locations: [{Line:1 Column:>
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 {"id":"5e6ad0f4-fc6d-4ee9-8a3e-xxxx","type":"complete"} server
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 no running subscription. exiting... client
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 {"type":"connection_init","payload":{"token":"***"}} client
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 {"type":"connection_ack"} server
evcc-wk  | [pulse ] TRACE 2025/01/10 15:49:52 no running subscription. exiting... client
evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:07 ----

And then, when "outdated" is detected:

evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:22 ----
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:22 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:22 charge currents: [0.007 0.007 0.009]A
evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:22 grid power: 427W
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/chargePower: 0
evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:22 grid currents: [0 0 0]A
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/chargeCurrents: [0.007000000216066837 0.007000000216066837 0.008999999612569809]
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 batteryGridChargeActive: false
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 gridPower: 427
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 gridCurrents: [0 0 0]
evcc-wk  | [sunspec] TRACE 2025/01/10 15:50:22 modbus: send 11 5f 00 00 00 06 01 03 9c 87 00 32
evcc-wk  | [sunspec] TRACE 2025/01/10 15:50:22 modbus: recv 11 5f 00 00 00 67 01 03 64 00 d1 00 38 00 4f 00 49 ff fe 0f b8 0f b8 0f b7 09 1a 09 1b 09 1d ff ff 4e 20 ff fc 13 85 ff fe 13 06 ff ff ec fa ff ff f2 04 ff fc 03 96 02 60 00 00>
evcc-wk  | [sunspec] TRACE 2025/01/10 15:50:22 model 103 block 0 point DCW: 2.0304
evcc-wk  | [pv1   ] TRACE 2025/01/10 15:50:22 modbus: send 11 60 00 00 00 06 01 03 f5 74 00 02
evcc-wk  | [pv1   ] TRACE 2025/01/10 15:50:22 modbus: recv 11 60 00 00 00 07 01 03 04 00 00 00 00
evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:22 pv 1 power: 2W
evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:22 site power: 427W
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 pvPower: 2.0304
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 pvEnergy: 0
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 pv: [{2.0304 0 0 <nil> <nil> <nil>}]
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 homePower: 429.0304
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:22 charge total import: 11476.858kWh
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/smartCostActive: false
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/smartCostNextStart: 0001-01-01 00:00:00 +0000 UTC
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:22 charger status: B
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/chargeCurrent: 0
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:22 next soc poll remaining time: 12m15s
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/sessionEnergy: 4.486388888835791
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/sessionSolarPercentage: 0
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/sessionPricePerKWh: 0.3351
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/sessionPrice: 0.0015033889166488734
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/sessionCo2PerKWh: <nil>
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/chargedEnergy: 4.486388888835791
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/chargeDuration: 15s
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/chargeTotalImport: 11476.857793333358
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:22 !! pvScalePhases DOWN activePhases: 1, available power: -427W, scalable: false
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/effectivePriority: 0
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:22 !! pvScalePhases UP maxPhases: 3, available power: -427W, scalable: false
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:22 !! pvScalePhases EXIT
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:22 pv charge current: 0A = 0A + -1.86A (427W @ 1p)
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/effectivePlanId: 0
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/effectivePlanTime: 0001-01-01 00:00:00 +0000 UTC
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/effectivePlanSoc: 0
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/effectiveMinCurrent: 8
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/effectiveMaxCurrent: 16
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/effectiveLimitSoc: 100
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/vehicleWelcomeActive: false
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/connected: true
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/charging: false
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/enabled: false
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/mode: pv
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/planProjectedStart: 0001-01-01 00:00:00 +0000 UTC
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/planProjectedEnd: 0001-01-01 00:00:00 +0000 UTC
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 lp-1/planOverrun: 0s
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 greenShareHome: 0.004732531773972194
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 greenShareLoadpoints: 0
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 tariffPriceHome: 0.3592769702100364
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:22 tariffPriceLoadpoints: 0.3604
evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:37 ----
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:37 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:37 charge currents: [0.007 0.007 0.009]A
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:37 lp-1/chargePower: 0
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:37 lp-1/chargeCurrents: [0.007000000216066837 0.007000000216066837 0.008999999612569809]
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:37 batteryGridChargeActive: false
evcc-wk  | [sunspec] TRACE 2025/01/10 15:50:37 modbus: send 11 61 00 00 00 06 01 03 9c 87 00 32
evcc-wk  | [sunspec] TRACE 2025/01/10 15:50:37 modbus: recv 11 61 00 00 00 67 01 03 64 00 d2 00 39 00 4f 00 48 ff fe 0f d1 0f d2 0f d2 09 19 09 1a 09 1f ff ff 1f 40 ff fd 13 85 ff fe 13 1a ff ff ec e6 ff ff c8 0e ff fc 03 96 02 60 00 00>
evcc-wk  | [sunspec] TRACE 2025/01/10 15:50:37 model 103 block 0 point DCW: 8.121
evcc-wk  | [pv1   ] TRACE 2025/01/10 15:50:37 modbus: send 11 62 00 00 00 06 01 03 f5 74 00 02
evcc-wk  | [pv1   ] TRACE 2025/01/10 15:50:37 modbus: recv 11 62 00 00 00 07 01 03 04 00 00 00 00
evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:37 pv 1 power: 8W
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:37 pvPower: 8.121
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:37 pvEnergy: 0
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:37 pv: [{8.121 0 0 <nil> <nil> <nil>}]
evcc-wk  | [site  ] ERROR 2025/01/10 15:50:38 grid power: outdated
evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:52 ----
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:52 charge power: 0W
evcc-wk  | [lp-1  ] DEBUG 2025/01/10 15:50:52 charge currents: [0.007 0.007 0.009]A
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:52 lp-1/chargePower: 0
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:52 lp-1/chargeCurrents: [0.007000000216066837 0.007000000216066837 0.008999999612569809]
evcc-wk  | [cache ] TRACE 2025/01/10 15:50:52 batteryGridChargeActive: false
evcc-wk  | [sunspec] TRACE 2025/01/10 15:50:52 modbus: send 11 63 00 00 00 06 01 03 9c 87 00 32
evcc-wk  | [sunspec] TRACE 2025/01/10 15:50:52 modbus: recv 11 63 00 00 00 67 01 03 64 00 d1 00 39 00 4d 00 4a ff fe 0f d1 0f d2 0f d2 09 1e 09 1a 09 1e ff ff 32 c8 ff fd 13 87 ff fe 13 11 ff ff ec f0 ff ff a2 8b ff fc 03 96 02 60 00 00>
evcc-wk  | [sunspec] TRACE 2025/01/10 15:50:52 model 103 block 0 point DCW: 13.197000000000001
evcc-wk  | [pv1   ] TRACE 2025/01/10 15:50:52 modbus: send 11 64 00 00 00 06 01 03 f5 74 00 02
evcc-wk  | [pv1   ] TRACE 2025/01/10 15:50:52 modbus: recv 11 64 00 00 00 07 01 03 04 00 00 00 00
evcc-wk  | [site  ] DEBUG 2025/01/10 15:50:52 pv 1 power: 13W

@GrimmiMeloni
Collaborator

I didn't understand the second paragraph.

My mistake. I was thinking of a forward proxy, but that doesn't apply in the case of Traefik.

@GrimmiMeloni
Collaborator

Something about this problem must be "visible" inside the container as well, though, because evcc itself reports unhealthy.

The unhealthy status most likely appears because the Tibber error occurs first. The health status of the evcc core is based on a control cycle having completed successfully within 1 minute plus the control interval. If an error occurs while reading the grid meter, no control is possible. That's why your instance then goes into the unhealthy state after 75s. So this is merely a follow-on error.
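That health logic surfaces in Docker through the image's HEALTHCHECK. A minimal sketch of making the probe explicit in compose, assuming the container image ships an `evcc health` subcommand as its probe (the exact command and the timing values are assumptions, not verified against the official image):

```yaml
services:
  evcc-wk:
    healthcheck:
      # assumed probe: exits non-zero when the last control cycle
      # did not complete in time (e.g. grid meter unreadable)
      test: ["CMD", "evcc", "health"]
      interval: 60s
      timeout: 10s
      retries: 1
      start_period: 60s
```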

@andig
Member

andig commented Jan 11, 2025

Still, evcc is obviously running (the ---- lines in the log are the control intervals). So the UI must be reachable as well.

@foto-andreas
Contributor Author

foto-andreas commented Jan 11, 2025

Yes, we already established that above by calling the UI with curl directly inside the container. I'm now hunting for the problem in the container.

The fact that EVCC doesn't reconnect on its own I'm now working around with a healthcheck and a healing container.
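One common pattern for such a healing container is a watcher that restarts anything Docker marks unhealthy. A sketch using the third-party `willfarrell/autoheal` image (its label convention is that image's own, nothing evcc-specific):

```yaml
services:
  autoheal:
    image: willfarrell/autoheal
    restart: always
    environment:
      # only restart containers that opt in via the "autoheal" label
      - AUTOHEAL_CONTAINER_LABEL=autoheal
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

  evcc-wk:
    labels:
      - autoheal=true
```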

@foto-andreas
Contributor Author

I can now reproduce it intermittently by starting EVCC with Tibber but with a wrong home ID in the config. EVCC starts up; the container is first set to starting and then, after a minute, to unhealthy. And sure enough, with debug logging enabled in Traefik the log contains the following explanatory line:

traefik     | 2025-01-11T18:28:32Z DBG github.com/traefik/traefik/v3/pkg/provider/docker/config.go:200 > Filtering unhealthy or starting container container=evcc-py-evcc-py-95d41e4150b9fed3a1d4928cdc10aa2fcb934be0f3e54c2d52ad2c48b2d6bafa providerName=docker

Traefik filters out unhealthy containers. That explains the 404. Question answered.

Have a nice Sunday!

@GrimmiMeloni
Collaborator

First of all, thanks for sharing the explanation for the unresponsive UI.
What I'm still missing is why this is a permanent state.
By my current understanding I would expect that as soon as Tibber comes back online, the grid meter check works again and the container becomes healthy, and thus the UI is ultimately reachable again via Traefik. But that doesn't seem to be the case?

@foto-andreas
Contributor Author

That just doesn't happen. There have already been similar tickets about this (#17094, #7314). So far the answer has always been: "Without a grid meter it simply doesn't work, Tibber is to blame. They have to get it under control." I see it differently, but it is probably pointless to discuss that again.

@andig
Member

andig commented Jan 12, 2025

There have already been similar tickets about this (#17094, #7314). So far the answer has always been:

Above all, there have also been fixes for this that reproducibly work. Do they solve all problems? Apparently not.

Tibber is to blame

Obviously either the device or the API has a flaw, so sure. And of course it can be solved. But that requires a reproducible case that can be examined in a debugger. So far there isn't one. If you have better ideas: gladly!

@foto-andreas
Contributor Author

I rule out the device (Pulse), since both houses were affected at the same time. According to the log, the Tibber API apparently returns the message "unable to start stream api" 8 times, with a delay of roughly 0s.

In the stack above, EVCC sits at tibber-pulse:117. That is the Close on the GraphQL client. Maybe something hangs there; at least it appears in both stack trace variants. Perhaps add TRACE output at the places where a retry is expected and possibly doesn't happen.

I could not trace the 8 retries in the code.

@Bombarde76

I unfortunately have to run now. Does this happen to none of the other Tibber/evcc users? I would actually have expected a more global problem here, since for me both houses are always affected as well.

Hello everyone,
I have had the same problem since mid-December, exactly as you describe it.

@andig
Member

andig commented Jan 13, 2025

Are you also using Docker?

@Bombarde76

Yes, I also use Docker (Synology).

Labels
question Rather clarification than issue

4 participants