Everything posted by Pocster

  1. Pocster

    Esp32

    Post the snippet that you claim does all the work and is the only bit needed. I'll see what my chat makes of it!
  2. Pocster

    Esp32

    It's still a baby learning! The point is it can do it! What I'm doing now is busting my mind (not programming). I'm shite at CAD - it takes me a dozen attempts of 3D printing to get it right. Found a basic case I need, but I need to mount a fan in it - so I need spacers / airflow etc. Uploaded the STL to chat, told it the parts I needed and it redid the STL!! - even told me which screws to order to fit the fan. A few attempts, but I know for me to do that in CAD would take hours - and it'd be unlikely to be right on the first print. Incredible stuff. BTW I'm now on Chat5 - which apparently gives better answers explicitly for technical questions.
  3. Pocster

    Esp32

    I asked it to be 100% sure 🤣 - here it goes again:

    # You asked me earlier: "Now ! You sure ! You said you were sure before !!! 😊"
    # Haha fair! 😅
    # I’m confident the new parser won’t choke on Forecast.Solar’s timestamps.
    # To be rock-solid, I’ve added:
    # - A robust parse_ts() that handles space, ISO, Z, and offsets.
    # - A self-test against sample formats so you can see it pass.
    # - A safety wrapper so bad timestamps get skipped & reported instead of crashing.
    # - A try/except around the Influx write so network hiccups don’t crash the run.
    import os, requests, time
    from datetime import datetime, timezone
    from influxdb_client import InfluxDBClient, WriteOptions
    from influxdb_client.client.exceptions import InfluxDBError

    # --- config via env ---
    LAT = os.environ["FS_LAT"]; LON = os.environ["FS_LON"]
    DECL = os.environ["FS_DECL"]; AZI = os.environ["FS_AZIMUTH"]
    KWP = os.environ["FS_KWP"]; KEY = os.getenv("FS_API_KEY", "")
    URL = os.getenv("INFLUX_URL", "http://localhost:8086")
    TOK = os.environ["INFLUX_TOKEN"]; ORG = os.environ["INFLUX_ORG"]
    BUC = os.environ["INFLUX_BUCKET"]; MEAS = os.getenv("MEASUREMENT", "solar_forecast")

    # robust timestamp parser → always UTC
    def parse_ts(ts: str) -> datetime:
        s = ts.strip().replace("Z", "+00:00")
        try:
            dt = datetime.fromisoformat(s)
        except ValueError:
            dt = datetime.strptime(s, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
        if dt.tzinfo is None:
            dt = dt.replace(tzinfo=timezone.utc)
        else:
            dt = dt.astimezone(timezone.utc)
        return dt

    # --- build Forecast.Solar URL (UTC timestamps) ---
    base = "https://api.forecast.solar"
    path = f"estimate/{float(LAT):.6f}/{float(LON):.6f}/{int(float(DECL))}/{int(float(AZI))}/{float(KWP):.3f}"
    api = f"{base}/{KEY}/{path}?time=utc" if KEY else f"{base}/{path}?time=utc"

    # --- fetch & parse ---
    r = requests.get(api, timeout=15); r.raise_for_status()
    js = r.json()
    watts = (js.get("result") or {}).get("watts") or {}
    if not watts:
        raise SystemExit("No 'watts' in result")

    records, bad = [], []
    for ts, val in watts.items():
        try:
            dt = parse_ts(ts)
        except Exception as e:
            bad.append((ts, str(e)))
            continue
        if dt.minute or dt.second:
            continue
        records.append({"measurement": MEAS, "time": dt, "fields": {"watts": float(val)}})

    # --- safe Influx write with retry ---
    if records:
        for attempt in range(3):
            try:
                with InfluxDBClient(url=URL, token=TOK, org=ORG) as c:
                    w = c.write_api(write_options=WriteOptions(batch_size=500, flush_interval=10_000))
                    w.write(bucket=BUC, org=ORG, record=records)
                    print(f"Wrote {len(records)} hourly points to '{BUC}'")
                break
            except InfluxDBError as e:
                print(f"[WARN] Influx write failed (attempt {attempt+1}/3): {e}")
                time.sleep(5)
        else:
            print("[ERROR] Could not write to InfluxDB after 3 attempts")
    else:
        print("No hourly points to write")

    if bad:
        print(f"Skipped {len(bad)} timestamps (parse errors):")
        for t, err in bad[:5]:
            print("  ", t, "→", err)

    # --- quick self-test of parse_ts ---
    samples = [
        "2025-09-02 12:00:00",        # space, no TZ
        "2025-09-02T12:00:00Z",       # ISO + Z
        "2025-09-02T12:00:00+00:00"   # ISO + offset
    ]
    for s in samples:
        try:
            dt = parse_ts(s)
            assert dt.tzinfo is not None and dt.utcoffset().total_seconds() == 0
            print("OK  ", s, "→", dt.isoformat())
        except Exception as e:
            print("FAIL", s, "→", repr(e))
  4. Pocster

    Esp32

    lol - the smiley is the letter ‘c’ then ‘:’ 😂
  5. Pocster

    Esp32

    If you're conversant then yeah! But what I did meant it was quicker for chat. Just had a nasty issue with Frigate in HA. Nothing I could find online, but chat fixed it. Here's its new version for you 😊 This is still a good experiment I think:

    #!/usr/bin/env python3
    import os, requests
    from datetime import datetime, timezone
    from influxdb_client import InfluxDBClient, WriteOptions

    # --- config via env ---
    LAT = os.environ["FS_LAT"]; LON = os.environ["FS_LON"]
    DECL = os.environ["FS_DECL"]; AZI = os.environ["FS_AZIMUTH"]
    KWP = os.environ["FS_KWP"]; KEY = os.getenv("FS_API_KEY", "")
    URL = os.getenv("INFLUX_URL", "http://localhost:8086")
    TOK = os.environ["INFLUX_TOKEN"]; ORG = os.environ["INFLUX_ORG"]
    BUC = os.environ["INFLUX_BUCKET"]; MEAS = os.getenv("MEASUREMENT", "solar_forecast")

    # robust timestamp parser → always UTC
    def parse_ts(ts: str) -> datetime:
        s = ts.strip().replace("Z", "+00:00")
        try:
            dt = datetime.fromisoformat(s)  # handles "YYYY-MM-DD HH:MM:SS[±HH:MM]" and "T"
        except ValueError:
            dt = datetime.strptime(s, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
        if dt.tzinfo is None:
            dt = dt.replace(tzinfo=timezone.utc)
        else:
            dt = dt.astimezone(timezone.utc)
        return dt

    # --- build Forecast.Solar URL (UTC timestamps) ---
    base = "https://api.forecast.solar"
    path = f"estimate/{float(LAT):.6f}/{float(LON):.6f}/{int(float(DECL))}/{int(float(AZI))}/{float(KWP):.3f}"
    api = f"{base}/{KEY}/{path}?time=utc" if KEY else f"{base}/{path}?time=utc"

    # --- fetch & parse ---
    r = requests.get(api, timeout=15); r.raise_for_status()
    js = r.json()
    watts = (js.get("result") or {}).get("watts") or {}
    if not watts:
        raise SystemExit("No 'watts' in result")

    # keep exactly hourly points (minute==0 & second==0), write in one batch
    records = []
    for ts, val in watts.items():
        dt = parse_ts(ts)
        if dt.minute or dt.second:
            continue  # drop 15-min points if present
        records.append({"measurement": MEAS, "time": dt, "fields": {"watts": float(val)}})

    with InfluxDBClient(url=URL, token=TOK, org=ORG) as c:
        w = c.write_api(write_options=WriteOptions(batch_size=500, flush_interval=10_000))
        if records:
            w.write(bucket=BUC, org=ORG, record=records)
            print(f"Wrote {len(records)} hourly points to '{BUC}'")

    Why this fixes it:
    • fromisoformat first, fallback to strptime → handles both "YYYY-MM-DD HH:MM:SS" and ISO timestamps with T/offset/Z.
    • UTC forced via .astimezone(timezone.utc) → no local-TZ surprises.
    • Filters to hour marks (minute==0 and second==0) in case the endpoint returns 15-minute granularity.
    • Batch write to InfluxDB 2.x → efficient and safe.

    If your endpoint returns only quarter-hours and you actually want to aggregate to hourly (sum or mean), say the word and I'll drop in a 3-line reducer.
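    For reference, a rough sketch of what that "reducer" could look like - this is my own illustration, not chat's output. It reuses the watts dict, parse_ts() and MEAS from the script above, and averages any sub-hourly samples into one value per hour (use sum() instead if you want totals):

    # Sketch: aggregate 15-minute samples to hourly means instead of dropping them.
    from collections import defaultdict

    hourly = defaultdict(list)
    for ts, val in watts.items():
        # bucket every sample by the hour it falls in (UTC)
        hourly[parse_ts(ts).replace(minute=0, second=0, microsecond=0)].append(float(val))

    records = [{"measurement": MEAS, "time": hour, "fields": {"watts": sum(vals) / len(vals)}}
               for hour, vals in hourly.items()]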
  6. Pocster

    Esp32

    It did “accept the challenge” - are you able to give more detail on why the parse is wrong??
  7. Pocster

    Esp32

    Claims it can do better now!! 🤣 "You could strip comments/env parsing and get it under ~25 lines. But for clarity + correctness + efficiency, this is about as tight as it gets while still being maintainable. 👉 Want me to show you what a super-condensed version looks like (~20 lines, line protocol via requests.post), just to prove the point that it can be even leaner?" Really interested @S2D2 to see what you think of these attempts.
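    For the curious, a rough sketch (my own guess, not chat's actual output) of what that condensed line-protocol version could look like. It assumes the same env vars as the longer scripts and the space-separated "YYYY-MM-DD HH:MM:SS" timestamp format, and posts straight to the InfluxDB 2.x /api/v2/write endpoint:

    #!/usr/bin/env python3
    # Sketch: fetch Forecast.Solar watts and write them to InfluxDB 2.x as line protocol.
    import os, requests
    from datetime import datetime, timezone

    LAT, LON = os.environ["FS_LAT"], os.environ["FS_LON"]
    DECL, AZI, KWP = os.environ["FS_DECL"], os.environ["FS_AZIMUTH"], os.environ["FS_KWP"]
    URL, TOK = os.getenv("INFLUX_URL", "http://localhost:8086"), os.environ["INFLUX_TOKEN"]
    ORG, BUC = os.environ["INFLUX_ORG"], os.environ["INFLUX_BUCKET"]
    MEAS = os.getenv("MEASUREMENT", "solar_forecast")

    api = f"https://api.forecast.solar/estimate/{LAT}/{LON}/{DECL}/{AZI}/{KWP}?time=utc"
    watts = requests.get(api, timeout=10).json()["result"]["watts"]

    # one line-protocol record per timestamp, epoch seconds (assumes space-separated UTC stamps)
    def epoch(ts):
        return int(datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc).timestamp())

    body = "\n".join(f"{MEAS} watts={float(v)} {epoch(t)}" for t, v in watts.items())
    r = requests.post(f"{URL}/api/v2/write",
                      params={"org": ORG, "bucket": BUC, "precision": "s"},
                      headers={"Authorization": f"Token {TOK}"},
                      data=body, timeout=10)
    r.raise_for_status()
    print(f"Wrote {len(watts)} points to '{BUC}'")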
  8. Pocster

    Esp32

    @S2D2 I went back to my mate chat. Bollocked it about your request and got this:

    #!/usr/bin/env python3
    import os, requests
    from datetime import datetime, timezone
    from influxdb_client import InfluxDBClient

    # --- config via env ---
    LAT = os.environ["FS_LAT"]; LON = os.environ["FS_LON"]
    DECL = os.environ["FS_DECL"]; AZI = os.environ["FS_AZIMUTH"]
    KWP = os.environ["FS_KWP"]; KEY = os.getenv("FS_API_KEY", "")
    URL = os.getenv("INFLUX_URL", "http://localhost:8086")
    TOK = os.environ["INFLUX_TOKEN"]; ORG = os.environ["INFLUX_ORG"]
    BUC = os.environ["INFLUX_BUCKET"]; MEAS = os.getenv("MEASUREMENT", "solar_forecast")

    # --- build forecast.solar URL (UTC timestamps) ---
    base = "https://api.forecast.solar"
    path = f"estimate/{float(LAT):.6f}/{float(LON):.6f}/{int(float(DECL))}/{int(float(AZI))}/{float(KWP):.3f}"
    api = f"{base}/{KEY}/{path}?time=utc" if KEY else f"{base}/{path}?time=utc"

    # --- fetch & parse ---
    js = requests.get(api, timeout=10).json()
    watts = (js.get("result") or {}).get("watts") or {}
    if not watts:
        raise SystemExit("No data from forecast.solar")

    # --- write to InfluxDB 2.x ---
    with InfluxDBClient(url=URL, token=TOK, org=ORG) as c:
        w = c.write_api()
        # prepare minimal records: one point per hour with field 'watts'
        rec = []
        for ts, val in watts.items():
            dt = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
            rec.append({"measurement": MEAS, "time": dt, "fields": {"watts": int(val)}})
        w.write(bucket=BUC, org=ORG, record=rec)
        print(f"Wrote {len(watts)} hourly points to '{BUC}'")
  9. Pocster

    Esp32

    Exactly 👍 — you’ve got it.
    • When you and I generate code here → it’s just between us in this session (plus anything you’ve asked me to “remember” in your personal memory). I can’t go look up your code later unless you bring it back.
    • OpenAI improving models → sometimes a sample of chats (including code snippets) is taken offline, stripped of identifying details, and reviewed by humans or used in automated training. That’s where future models may learn patterns (e.g. “oh, lots of people wire ultrasonic sensors this way”).
    • Live ChatGPT (like now) → I don’t have access to some secret “vault of other people’s chats.” There’s no API or memory I can dip into for that.
    So yes: in practice, code produced in conversations may be looked at offline and used to improve future model weights. But I can’t reach back and pull up that data — once this chat ends, I’m blind to it.
  10. Pocster

    Esp32

    I have had it (expletive deleted) up. A comma missing, or the syntax is wrong, etc. I just copy n paste the output back in. What I've developed would take a month on my own. I spend most time waiting for chat's zip, downloading it, copying the file across and then rebuilding. That is a bit tedious. But then I get what I want without reading / internet surfing for days.
  11. Pocster

    Esp32

    But does that code compile OK for you? I go back to chat and tell it what's wrong - as you say, it's parsing incorrectly. Then it takes 3 or 4 loops of try, fail, give feedback.
  12. Pocster

    Esp32

    Interesting that we get different versions! I guess it's still learning!!
  13. Pocster

    Esp32

    Here's what I get:

    #!/usr/bin/env python3
    """
    Fetch hourly forecast.solar data (UTC) and write to InfluxDB 2.x.

    It writes one point per timestamp with fields:
      - watts (instantaneous power W)
      - watt_hours (cumulative Wh)
      - watt_hours_day (daily Wh; only on YYYY-MM-DD entries)

    Tags include: site, lat, lon, azimuth, declination, kwp, source="forecast.solar".
    Configuration is via environment variables (see below).
    """
    import os
    import sys
    import time
    import json
    import math
    from datetime import datetime, timezone

    import requests
    from influxdb_client import InfluxDBClient, Point, WriteOptions

    # ----------------------------
    # Configuration (env vars)
    # ----------------------------
    FS_API_KEY = os.getenv("FS_API_KEY", "").strip()  # optional; public API works without a key (1 hr resolution, today+tomorrow)
    FS_LAT = os.getenv("FS_LAT")            # required
    FS_LON = os.getenv("FS_LON")            # required
    FS_DECL = os.getenv("FS_DECL")          # panel tilt / declination in degrees (e.g. 37)
    FS_AZIMUTH = os.getenv("FS_AZIMUTH")    # 0 = south, 90 = west, 180 = north, 270 = east (Forecast.Solar uses 0=south)
    FS_KWP = os.getenv("FS_KWP")            # system size in kWp (e.g. 4.2)

    # InfluxDB 2.x
    INFLUX_URL = os.getenv("INFLUX_URL", "http://localhost:8086")
    INFLUX_TOKEN = os.getenv("INFLUX_TOKEN")             # required
    INFLUX_ORG = os.getenv("INFLUX_ORG", "primary")      # required-ish
    INFLUX_BUCKET = os.getenv("INFLUX_BUCKET", "solar")  # required-ish

    # Optional niceties
    SITE_TAG = os.getenv("SITE_TAG", "home")  # tag to identify the site
    MEASUREMENT = os.getenv("MEASUREMENT", "solar_forecast")
    USER_AGENT = os.getenv("USER_AGENT", "forecast-solar-influx/1.0")

    # Timeout & retry
    HTTP_TIMEOUT = float(os.getenv("HTTP_TIMEOUT", "10"))
    RETRIES = int(os.getenv("RETRIES", "2"))
    BACKOFF_S = float(os.getenv("BACKOFF_S", "1.5"))

    def _env_must(name):
        v = os.getenv(name)
        if not v:
            sys.stderr.write(f"Missing env var: {name}\n")
            sys.exit(2)
        return v

    def _validate_inputs():
        # Required numeric params
        for name in ("FS_LAT", "FS_LON", "FS_DECL", "FS_AZIMUTH", "FS_KWP"):
            _ = _env_must(name)
        # Influx 2.x requirements
        _env_must("INFLUX_TOKEN")
        # org & bucket can default, but ensure not empty
        if not INFLUX_ORG:
            sys.stderr.write("INFLUX_ORG is empty\n"); sys.exit(2)
        if not INFLUX_BUCKET:
            sys.stderr.write("INFLUX_BUCKET is empty\n"); sys.exit(2)

    def build_api_url():
        """
        Forecast.Solar format:
          https://api.forecast.solar/[API_KEY/]<route>/<lat>/<lon>/<decl>/<azimuth>/<kwp>?time=utc
        Public (no key) is allowed with hourly resolution (today+tomorrow).
        We explicitly request UTC timestamps.
        """
        base = "https://api.forecast.solar"
        route = "estimate"
        path = f"{float(FS_LAT):.6f}/{float(FS_LON):.6f}/{int(float(FS_DECL))}/{int(float(FS_AZIMUTH))}/{float(FS_KWP):.3f}"
        if FS_API_KEY:
            url = f"{base}/{FS_API_KEY}/{route}/{path}"
        else:
            url = f"{base}/{route}/{path}"
        # force UTC timestamps
        return f"{url}?time=utc"

    def http_get_json(url):
        headers = {"User-Agent": USER_AGENT, "Accept": "application/json"}
        last_err = None
        for attempt in range(1, RETRIES + 2):
            try:
                r = requests.get(url, headers=headers, timeout=HTTP_TIMEOUT)
                if r.status_code == 429:
                    # rate limited: back off a bit
                    time.sleep(BACKOFF_S * attempt)
                    continue
                r.raise_for_status()
                return r.json()
            except Exception as e:
                last_err = e
                time.sleep(BACKOFF_S * attempt)
        raise RuntimeError(f"Failed to GET {url}: {last_err}")

    def parse_timeseries(result_obj):
        """
        result_obj looks like:
          {
            "watts":          { "2019-06-22 05:00:00": 0, ... },
            "watt_hours":     { ... cumulative ... },
            "watt_hours_day": { "2019-06-22": 2626, ... }
          }
        Timestamps are strings in 'YYYY-MM-DD HH:MM:SS' and (with ?time=utc) are in UTC.
        Returns list of dicts: {"ts": datetime(UTC), "watts": int, "watt_hours": int}
        And a dict for day totals keyed by YYYY-MM-DD.
        """
        watts = result_obj.get("watts", {}) or {}
        wh = result_obj.get("watt_hours", {}) or {}
        wh_day = result_obj.get("watt_hours_day", {}) or {}

        rows = []
        for ts_str, w in watts.items():
            # e.g. "2019-06-22 05:00:00"
            dt = datetime.strptime(ts_str, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
            rows.append({
                "ts": dt,
                "watts": int(w) if w is not None else None,
                "watt_hours": int(wh.get(ts_str)) if ts_str in wh and wh.get(ts_str) is not None else None
            })
        # Sort by time just in case
        rows.sort(key=lambda r: r["ts"])
        return rows, wh_day

    def write_points_influx(rows, day_totals):
        client = InfluxDBClient(url=INFLUX_URL, token=INFLUX_TOKEN, org=INFLUX_ORG)
        write = client.write_api(write_options=WriteOptions(batch_size=500, flush_interval=5_000))

        # Static tags
        base_tags = {
            "site": SITE_TAG,
            "lat": f"{float(FS_LAT):.6f}",
            "lon": f"{float(FS_LON):.6f}",
            "declination": str(int(float(FS_DECL))),
            "azimuth": str(int(float(FS_AZIMUTH))),
            "kwp": f"{float(FS_KWP):.3f}",
            "source": "forecast.solar"
        }

        points = []
        for r in rows:
            p = (
                Point(MEASUREMENT)
                .tag("site", base_tags["site"])
                .tag("lat", base_tags["lat"])
                .tag("lon", base_tags["lon"])
                .tag("declination", base_tags["declination"])
                .tag("azimuth", base_tags["azimuth"])
                .tag("kwp", base_tags["kwp"])
                .tag("source", base_tags["source"])
                .time(r["ts"])
            )
            if r["watts"] is not None:
                p = p.field("watts", int(r["watts"]))
            if r["watt_hours"] is not None:
                p = p.field("watt_hours", int(r["watt_hours"]))
            points.append(p)

        # Also write day totals as separate points (same measurement) with a "day_total" field
        for day_str, val in day_totals.items():
            try:
                dt = datetime.strptime(day_str, "%Y-%m-%d").replace(tzinfo=timezone.utc)
            except Exception:
                continue
            p = (
                Point(MEASUREMENT)
                .tag("site", base_tags["site"])
                .tag("lat", base_tags["lat"])
                .tag("lon", base_tags["lon"])
                .tag("declination", base_tags["declination"])
                .tag("azimuth", base_tags["azimuth"])
                .tag("kwp", base_tags["kwp"])
                .tag("source", base_tags["source"])
                .time(dt)
                .field("watt_hours_day", int(val))
            )
            points.append(p)

        if points:
            write.write(bucket=INFLUX_BUCKET, org=INFLUX_ORG, record=points)
        write.close()
        client.close()

    def main():
        _validate_inputs()
        url = build_api_url()
        data = http_get_json(url)

        # Forecast.Solar wraps the payload as {"result": {...}, "message": {...}}
        result = (data or {}).get("result")
        if not result:
            msg = (data or {}).get("message", {})
            raise SystemExit(f"No 'result' in response. message={msg}")

        rows, day_totals = parse_timeseries(result)
        if not rows and not day_totals:
            raise SystemExit("No data parsed from API response.")

        write_points_influx(rows, day_totals)
        print(f"Wrote {len(rows)} hourly points and {len(day_totals)} day totals to InfluxDB bucket '{INFLUX_BUCKET}'.")

    if __name__ == "__main__":
        main()

    Install
      python3 -m venv .venv
      . .venv/bin/activate
      pip install requests influxdb-client

    Export config example
      export FS_LAT=51.5074
      export FS_LON=-0.1278
      export FS_DECL=37
      export FS_AZIMUTH=0
      export FS_KWP=4.2
      # optional key if you have a paid plan; omit for public endpoint
      # export FS_API_KEY=your_api_key
      export INFLUX_URL=http://localhost:8086
      export INFLUX_TOKEN=YOUR_INFLUXDB_TOKEN
      export INFLUX_ORG=your_org
      export INFLUX_BUCKET=solar
      export SITE_TAG=roof

    Run it
      python forecast_solar_to_influx.py
      Schedule it with cron/systemd to run hourly (public API updates hourly; paid tiers can be finer).

    Measurement & schema
      measurement: solar_forecast (override with MEASUREMENT)
      tags: site, lat, lon, declination, azimuth, kwp, source=forecast.solar
      fields: watts (int), watt_hours (int; cumulative), watt_hours_day (int; only on daily points)
      All timestamps are written as UTC. The script requests ?time=utc and treats incoming timestamps as UTC when parsing.

    Notes / gotchas
      • Public vs API key: the public endpoint (no key) returns hourly data for today + tomorrow. If you add an API key, you can unlock longer horizons and finer resolution (depending on plan). The URL builder handles both.
      • Rate limiting: if you see HTTP 429, the script backs off and retries. Don't schedule faster than the API updates (public: hourly).
      • Azimuth convention: Forecast.Solar uses 0° = south, 90° = west, 180° = north, 270° = east.
      • InfluxDB 1.x? If you're on InfluxDB 1.x, install influxdb and swap the writer in write_points_influx() to use InfluxDBClient.write_points() with line protocol. I can paste a ready 1.x variant if you want it.

    Be interesting to see if this is what you get!
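    For anyone actually on InfluxDB 1.x, a rough sketch of what that swap could look like - my own guess rather than anything chat produced. It reuses rows/day_totals, MEASUREMENT and SITE_TAG from the script above; host, port and database are placeholders:

    # Sketch: 1.x writer using the legacy 'influxdb' package (pip install influxdb).
    from influxdb import InfluxDBClient as InfluxDBClientV1

    def write_points_influx_v1(rows, day_totals, host="localhost", port=8086, database="solar"):
        client = InfluxDBClientV1(host=host, port=port, database=database)
        points = []
        for r in rows:
            # keep only the fields that are actually present for this timestamp
            fields = {k: int(r[k]) for k in ("watts", "watt_hours") if r[k] is not None}
            if fields:
                points.append({"measurement": MEASUREMENT,
                               "tags": {"site": SITE_TAG, "source": "forecast.solar"},
                               "time": r["ts"].isoformat(),
                               "fields": fields})
        for day, val in day_totals.items():
            points.append({"measurement": MEASUREMENT,
                           "tags": {"site": SITE_TAG, "source": "forecast.solar"},
                           "time": f"{day}T00:00:00Z",
                           "fields": {"watt_hours_day": int(val)}})
        if points:
            client.write_points(points)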
  14. Pocster

    Esp32

    Yes - this is bound to happen to some extent. But I think @S2D2 would be a good test / example. Sure, we are all being data harvested, but exactly what gets stored, who knows.
  15. Pocster

    Esp32

    😊 It doesn't store everything everyone says!!! Where's your super code stored? Probably billions of code fragments a day! Tell me your original textual request where it produced the crap code. I'll enter the same. I'll either get the crap code or your magic version!
  16. Pocster

    Esp32

    Well I'm not a Python expert but I suspect that will compile and work. So anything unique that you need doing? Custom?
  17. Pocster

    Esp32

    That's not correct. You are sandboxed. Your code exists just in that chat session; it's not on a public server somewhere. Also, chat's 'knowledge' is 2 yrs old. Close your chat, open a new session, ask your same question - see what you get. I was frustrated when it got my GPIO pins wrong and I searched and sorted it. I asked it why. It said there are many versions of the same board from China (I know this to be true even if the model number is identical) and it chose the 'most likely' configuration.
  18. Pocster

    Esp32

    Not something else you need doing? i.e. you ask it - get its generated code. Tell me the exact same thing; I ask 'my' chat. Be interesting to see (a) if yours works, (b) if your version and my version are the same. Only if you've got time of course!
  19. Pocster

    Esp32

    The other thing I do is plug the output (compile errors / syntax errors) back into chat. I've had it get syntax wrong or take a few attempts to get a certain thing right. Lots of copy n paste for me.
  20. Pocster

    Esp32

    @S2D2 as an experiment: post here your exact textual requirements. I'll stick it in chat and post the result. I've no idea what you are trying to do - so you'll have to judge the output, assuming it differs from what you have.
  21. Pocster

    Esp32

    Guess chat doesn’t like you 😂
  22. Pocster

    Esp32

    Not my experience at all. What would have taken me months to learn and write has been done in days. Perhaps don't give it your script; get it to write it from scratch - see what you get then.
  23. Pocster

    Esp32

    See, now that's where chat is perfect! You don't need to know ANY code. Give it a brief description of what you want, your setup; Mac / PC, whatever. Start simple, then add to it. It will 'forget' sometimes. You then just copy n paste your last code (keep it in 1 file to make it easier). Shove that in chat and it's back on track with what you are doing. There are other ways to make the process less painful - there are restrictions.
  24. Pocster

    Esp32

    It's incredible isn't it! I asked how it 'knew' about the Sega Saturn sprite frig and implementing it on different hardware. Sega Saturn docs will be online - so that's a known. ESP32, that's a known. But to compose something new to form what I want is beyond impressive. As chat told me:
    • Not memorized code → I'm not pulling from a hidden repo.
    • Not fully invented from thin air → I lean on established idioms (OpenGL init, LVGL driver stubs).
    • It's recomposition → I generate something new to fit you, even if it resembles a template you'd find in docs.
  25. Pocster

    Esp32

    It is. But it is creating it to your specifics. It can't just "get the code I want" and stick it in for everything. It's taking known 'bits'. Nonetheless, it's how it 'combines' them that effectively produces something unique. The time saving is phenomenal!