From Wire to Algorithm: The Evolution of Corporate Communication

Note: This was produced by a Qwen3.6-35B-A3B-FP8-DFlash AI model.

The history of business is inextricably linked to the history of communication. A corporation is, by definition, a collective entity that requires the coordination of people, capital, and information to function. Consequently, the evolution of corporate communication is not merely a chronicle of technological gadgets; it is a narrative of how organizations have conquered space and time, reshaped their internal hierarchies, and expanded their reach to engulf the globe. From the slow, physical dispatch of messengers to the instantaneous, algorithmic networks of the digital age, each leap in communication technology has fundamentally altered the structure, speed, and culture of corporate life. To understand how modern enterprises function, one must examine not only what these technologies achieved, but how they actually worked beneath the surface.

Pre-Electronic Era: The Physics of Physical Logistics

Before the electric telegraph, corporate communication was bound by the mechanics of physical transportation. Information was a tangible commodity, moved by horseback, sailing vessel, and eventually by railway and steamship. The technical infrastructure relied on human logistics: courier networks, postal sorting hubs, and published timetables. Messages were written by hand or set on early printing presses, sealed, and placed into leather satchels or mailbags. The speed of communication was dictated by geography, weather, and the mechanical limits of animal, wind, or steam propulsion.

This system forced organizations to delegate substantial regional autonomy. A merchant house in Amsterdam could not instantly verify inventory in Surabaya or adjust pricing in response to a storm in the Atlantic. Communication lag created informational asymmetry: headquarters held strategic advantage through delayed but curated reports, while field agents operated with significant discretion. The medium itself enforced formality; letters were permanent, expensive to dispatch, and carried legal weight. Corporate communication was thus slow, location-bound, and heavily layered, with information moving in discrete batches rather than continuous streams.

The Telegraph: Electromagnetism and the Compression of Time

The electric telegraph, commercialized in the 1840s, replaced physical transport with electrical pulse transmission. Its operation relied on a closed circuit of copper wire, a battery, and an electromagnetic relay. When an operator pressed a Morse key, it completed the circuit, sending an electrical current through the wire. At the receiving end, the current energized a magnet that attracted an iron armature, producing an audible click or marking a paper tape. Messages were encoded in Morse code: short pulses (dots) and long pulses (dashes), separated by precise intervals of silence.
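
As a rough illustration of the encoding layer, a minimal sketch of turning text into timed key presses might look like the following; the dot/dash table covers only a few letters, and the 1:3:7 timing units are the conventional ratios rather than anything a particular operator used.

    # Minimal sketch of Morse encoding: text -> dots/dashes -> timed key presses.
    # The table covers only a few letters, for illustration.
    MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}

    def to_pulses(message):
        """Return (key_down_units, key_up_units) pairs.
        Dot = 1 unit down, dash = 3; 1 unit between symbols,
        3 between letters, 7 between words (conventional ratios)."""
        pulses = []
        for word in message.upper().split():
            for letter in word:
                for symbol in MORSE[letter]:
                    pulses.append((1 if symbol == "." else 3, 1))
                pulses[-1] = (pulses[-1][0], 3)   # stretch the gap after each letter
            pulses[-1] = (pulses[-1][0], 7)       # and after each word
        return pulses

    print(to_pulses("SOS"))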

Long-distance transmission faced signal degradation due to electrical resistance, but the invention of electrical repeaters allowed signals to be boosted at intervals. Undersea telegraph cables, insulated with gutta-percha and armoured with iron wire, spanned oceans, while cross-continental networks used landlines strung along telegraph poles. Manual switching stations routed messages across different wires, enabling multi-branch corporate networks.

For business, this meant information could travel faster than the goods it described. Stock exchanges coordinated pricing across cities; railway companies synchronized freight schedules to prevent collisions; and multinational trading houses managed supply chains in near real-time. The telegraph also commodified brevity: charges were levied per word, forcing corporate communications into clipped, data-dense formats. More importantly, it established the expectation that business could no longer tolerate geographic delay, fundamentally altering the tempo of commerce.

The Telex: Machine-to-Machine Serial Communication

As global trade expanded, the human bottleneck of telegraph operators proved inefficient. The telex system, developed in the 1930s and widely deployed by the mid-twentieth century, automated message exchange using teleprinter machines connected to electrical switching networks. Each teleprinter combined a typewriter keyboard with a motor-driven transmitter and receiver. When an operator typed a letter, the machine translated it into a five-bit Baudot code, an early binary character encoding. These electrical signals were sent over dedicated copper lines or shared telephone circuits.
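
To make the five-bit idea concrete, here is a small sketch of packing characters into five-bit codes for serial transmission. The code values are illustrative only, not the real ITA2/Baudot assignments, and the sketch ignores the start/stop bits and letters/figures shift states the actual machines used.

    # Illustrative 5-bit character framing (not the real ITA2 assignments).
    CODES = {"A": 0b00011, "B": 0b11001, "C": 0b01110, " ": 0b00100}

    def to_bitstream(text, baud=50):
        """Serialise text as 5-bit codes, least significant bit first,
        and estimate airtime at a nominal 50 baud (a typical telex rate)."""
        bits = []
        for ch in text.upper():
            code = CODES[ch]
            bits.extend((code >> i) & 1 for i in range(5))
        return bits, len(bits) / baud   # bit stream plus seconds on the line

    stream, seconds = to_bitstream("AB AB")
    print(stream, f"{seconds:.2f} s on the wire")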

The innovation lay in the switching exchange. Unlike point-to-point telegraph lines, telex networks used store-and-forward routing: messages were queued at local exchanges, addressed by five-digit subscriber numbers, and automatically routed through cross-connect panels to the recipient’s machine. Once received, the destination teleprinter printed the message verbatim. This enabled direct machine-to-machine communication without human transcription, drastically reducing errors and accelerating transactional flows.
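
A toy model of that store-and-forward exchange, with the subscriber numbers and message text invented for illustration:

    # Toy store-and-forward exchange: one queue per five-digit subscriber number.
    from collections import defaultdict, deque

    class TelexExchange:
        def __init__(self):
            self.queues = defaultdict(deque)   # subscriber number -> pending messages

        def submit(self, destination, text):
            """Queue a message addressed to a subscriber number."""
            self.queues[destination].append(text)

        def deliver(self, destination):
            """When the destination line is free, forward queued messages in order."""
            while self.queues[destination]:
                print(f"{destination} prints: {self.queues[destination].popleft()}")

    exchange = TelexExchange()
    exchange.submit("54321", "LC OPENED USD 12,000 CONFIRM")   # hypothetical traffic
    exchange.deliver("54321")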

Corporations adopted telex for banking confirmations, shipping manifests, and trade letters of credit. The standardized Baudot format made telex ideal for automated data processing, laying early groundwork for modern electronic data interchange. Culturally, the telex machine became a symbol of international reach, but its limitations were clear: it required physical hardware, dedicated circuits, and could only transmit text. It was a step toward digital communication, yet still anchored to mechanical translation and synchronous line usage.

The Fax: Scanning, Modulation, and Document Integrity

The 1980s fax machine bridged the analog-digital divide by solving a uniquely corporate problem: how to transmit signed, formatted, or graphical documents instantly without sacrificing authenticity. Group 3 facsimile, the dominant standard, worked through a three-stage process: scanning, modulation, and printing.

At the sender’s end, a document was placed under a contact image sensor or CCD scanner. A light source illuminated the page while a mechanical carriage moved the sensor line by line. Reflective white areas and dark text produced varying electrical voltages, which were converted into analog signals. These signals were modulated using V.27ter or V.29 modem protocols, translating data into audio frequencies suitable for transmission over the public switched telephone network. The modulated signal traveled through standard phone lines to the recipient’s machine, which demodulated the audio back into electrical pulses. A thermal printer or inkjet mechanism then reproduced the original layout on paper.
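
In the real Group 3 standard, each scanline is compressed with modified Huffman coding of black and white run lengths before it reaches the modem stage; as a simplified stand-in, a plain run-length pass over one scanline looks like this:

    # Simplified stand-in for Group 3 scanline coding: run-length encode one line.
    # (Group 3 then Huffman-codes these run lengths before modulation.)
    def run_lengths(scanline):
        """scanline: sequence of 0 (white) / 1 (black) pixels -> list of (value, run)."""
        runs = []
        for pixel in scanline:
            if runs and runs[-1][0] == pixel:
                runs[-1][1] += 1
            else:
                runs.append([pixel, 1])
        return [tuple(run) for run in runs]

    line = [0, 0, 0, 1, 1, 0, 0, 0, 0, 1]
    print(run_lengths(line))   # [(0, 3), (1, 2), (0, 4), (1, 1)]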

Fax preserved the legal and aesthetic integrity of physical documents while achieving near-instant delivery. It required no digital literacy, worked over existing telephone infrastructure, and allowed real-time verification of signatures, charts, and handwritten notes. For corporations, it accelerated contract cycles, enabled remote approvals, and reduced reliance on courier services. Yet it remained hybrid: documents were electronically scanned but printed to physical media, limiting editability, storage, and searchability. It was a transitional technology, proving that corporate workflows were ready for digital transmission but still demanded the psychological comfort of paper.

Email: Packet Switching, Protocol Stacks, and Asynchronous Networks

Email entered widespread corporate use in the 1990s on the infrastructure of the commercial internet, and it represented a paradigm shift in how information was structured and routed. Unlike point-to-point technologies, email relied on TCP/IP packet switching. When a user composed a message, their email client packaged the text, metadata, and attachments into discrete data packets. Each packet received a header containing source and destination addresses, sequence numbers, and error-checking codes.
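
A schematic of that packetisation step, with field names and sizes simplified rather than matching the real TCP/IP header layouts:

    # Schematic packetisation: a simplification of the headers TCP/IP actually uses.
    from dataclasses import dataclass
    import zlib

    @dataclass
    class Packet:
        src: str            # source address
        dst: str            # destination address
        seq: int            # sequence number for reassembly ordering
        payload: bytes
        checksum: int = 0   # error-checking code over the payload

        def seal(self):
            self.checksum = zlib.crc32(self.payload)
            return self

    def split_message(body: bytes, src: str, dst: str, size: int = 1000):
        """Chop a message body into numbered packets of at most `size` bytes."""
        count = (len(body) + size - 1) // size
        return [Packet(src, dst, i, body[i * size:(i + 1) * size]).seal()
                for i in range(count)]

    packets = split_message(b"Quarterly figures attached. " * 200, "alice@hq", "bob@field")
    print(len(packets), "packets; first checksum:", packets[0].checksum)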

The Simple Mail Transfer Protocol handled delivery. The client sent the message to a local mail server, which used Domain Name System queries to locate the recipient’s mail exchange (MX) records. Routers then forwarded packets across the internet, dynamically choosing optimal paths based on network congestion. At the destination, the server reassembled the packets, verified integrity, and stored the message in a mailbox. Recipients retrieved messages using protocols like POP3 or IMAP, which dictated whether messages were downloaded locally or synced across devices.
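
That delivery path can be sketched with Python's standard smtplib plus the third-party dnspython package for the MX lookup; the addresses and hostnames below are hypothetical, and a production mail client would add authentication and TLS.

    # Sketch of SMTP delivery: MX lookup, then hand the message to the recipient's server.
    # Requires the third-party dnspython package; all addresses here are hypothetical.
    import smtplib
    from email.message import EmailMessage
    import dns.resolver

    msg = EmailMessage()
    msg["From"] = "alice@example.com"
    msg["To"] = "bob@example.net"
    msg["Subject"] = "Q3 forecast"
    msg.set_content("Draft attached for review.")

    # 1. Ask DNS for the recipient domain's mail exchange (MX) records.
    answers = dns.resolver.resolve("example.net", "MX")
    mx_host = str(min(answers, key=lambda r: r.preference).exchange).rstrip(".")

    # 2. Speak SMTP to that server; it stores the message in the recipient's mailbox,
    #    which is later retrieved over POP3 or IMAP.
    with smtplib.SMTP(mx_host, 25) as smtp:
        smtp.send_message(msg)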

This architecture made email asynchronous, globally scalable, and nearly costless to transmit. It decoupled communication from physical infrastructure, allowing organizations to operate across time zones without synchronous presence. The introduction of MIME enabled attachments, HTML formatting, and encrypted messages. However, the same technical openness that democratized communication also enabled spam, phishing, and information overload. Email’s decentralized design flattened corporate hierarchies by allowing direct peer-to-peer messaging, but it also turned the inbox into an unbounded repository of obligations, reshaping workplace cognition and productivity norms.

The Current Day: Cloud Architecture, Real-Time Sync, and AI-Augmented Flows

Today’s corporate communication ecosystem is not a single medium but a distributed, multi-protocol network built on cloud infrastructure, real-time data synchronization, and machine learning. The technical stack relies on four interlocking components:

Distributed cloud infrastructure and APIs power modern platforms like Slack, Microsoft Teams, and Zoom. Data is stored in distributed object storage and relational databases, accessible via REST or GraphQL APIs. Microservices architecture allows modular scaling: chat services, video servers, and notification engines operate independently, communicating through message brokers to maintain consistency.
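
As a concrete illustration of the API layer, posting a chat message typically reduces to an authenticated HTTP call; the endpoint, token, and JSON fields below are invented rather than taken from any specific product.

    # Hypothetical REST call to a chat platform; endpoint, token and fields are invented.
    import requests

    API_BASE = "https://chat.example.com/api/v1"   # placeholder, not a real service
    TOKEN = "app-token-redacted"                   # placeholder credential

    response = requests.post(
        f"{API_BASE}/channels/finance/messages",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"text": "Board deck uploaded", "thread_id": None},
        timeout=10,
    )
    response.raise_for_status()
    print(response.json())   # notification and indexing microservices pick it up from here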

Real-time collaboration protocols replace email’s store-and-forward model with persistent, bidirectional connections using WebSockets or Server-Sent Events. When multiple users edit a document simultaneously, systems employ Conflict-free Replicated Data Types or Operational Transformation algorithms. These mathematical frameworks track every keystroke as an operation, synchronize versions across devices, and resolve conflicts deterministically without central coordination.
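
The simplest CRDT, a grow-only set in which every replica adds elements independently and any two replicas converge by taking the union, gives the flavour; real collaborative editors use far richer structures such as sequence CRDTs or OT.

    # Minimal CRDT: a grow-only set (G-Set). Replicas add locally and merge by union;
    # merging is commutative, associative and idempotent, so sync order doesn't matter.
    class GSet:
        def __init__(self):
            self.items = set()

        def add(self, element):
            self.items.add(element)

        def merge(self, other):
            self.items |= other.items

    laptop, phone = GSet(), GSet()
    laptop.add("agenda item: budget")
    phone.add("agenda item: hiring")

    laptop.merge(phone)
    phone.merge(laptop)
    assert laptop.items == phone.items   # both replicas converge to the same state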

Voice and video communication leverage the Real-Time Transport Protocol and WebRTC, establishing peer-to-peer encrypted channels. Media streams are compressed using modern codecs, while NAT traversal techniques allow connections through corporate firewalls. Adaptive bitrate streaming adjusts quality based on network latency, enabling stable cross-continental meetings.
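
The adaptive-bitrate step can be pictured as a simple feedback rule that picks the highest video tier the measured network conditions will sustain; the tiers and thresholds below are illustrative, not taken from any particular codec or product.

    # Illustrative adaptive-bitrate rule: pick the highest tier the network sustains.
    TIERS_KBPS = [2500, 1200, 600, 300]   # e.g. high, medium, low video, audio-priority

    def choose_bitrate(estimated_bandwidth_kbps, packet_loss):
        """Back off hard on loss; otherwise fill about 80% of measured bandwidth."""
        if packet_loss > 0.05:
            return TIERS_KBPS[-1]
        budget = estimated_bandwidth_kbps * 0.8
        for tier in TIERS_KBPS:
            if tier <= budget:
                return tier
        return TIERS_KBPS[-1]

    print(choose_bitrate(1800, 0.01))   # -> 1200
    print(choose_bitrate(1800, 0.08))   # -> 300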

Artificial intelligence and encryption form the final layer. Transformer-based models process communication metadata, enabling real-time transcription, summarization, sentiment analysis, and intelligent routing. End-to-end encryption uses asymmetric cryptography to ensure only participants hold decryption keys. Zero-trust network architectures verify every request, while tokenization isolates sensitive data from processing engines.
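
The asymmetric primitive underneath end-to-end encryption can be sketched with the widely used Python cryptography package: the recipient publishes a public key, and only the matching private key recovers the message. Real messaging systems layer key exchange and symmetric session keys on top of this primitive.

    # Asymmetric idea in miniature: encrypt with the recipient's public key,
    # decrypt with their private key (using the `cryptography` package).
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    ciphertext = public_key.encrypt(b"merger terms v3", oaep)
    plaintext = private_key.decrypt(ciphertext, oaep)
    assert plaintext == b"merger terms v3"   # only the private-key holder can do this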

This architecture has dissolved the traditional office. Workspaces are now API-driven, always-on, and globally distributed. Collaboration is visual, synchronous, and algorithmically augmented. Yet the technical complexity introduces new vulnerabilities: cognitive fragmentation from notification overload, data silos across incompatible platforms, AI-generated inaccuracies in automated summaries, and persistent security risks from exposed endpoints. The modern corporation is less a building and more a network topology, constantly synchronizing across time zones, devices, and algorithms.

The Next Architecture: Anticipation, Immersion, and the Post-Interface Workplace

If the past two centuries of corporate communication have been defined by the compression of distance and the acceleration of data flow, the coming decades will likely be shaped by the dissolution of the interface itself. Future communication will not merely be faster or more connected; it will become anticipatory, immersive, and increasingly mediated by autonomous systems.

AI-Mediated and Autonomous Communication
Generative artificial intelligence is transitioning from a tool of assistance to an active participant in corporate communication. Enterprise systems will likely deploy large language models augmented with retrieval-augmented generation, persistent vector memory, and secure multi-agent frameworks to function as communication intermediaries. These systems ingest organizational data, style guides, and real-time context to draft, route, and summarize messages without human intervention. They use attention mechanisms to weigh relevance, fine-tuned adapters to match corporate tone, and policy enforcement layers to prevent data leakage. This shifts corporate communication from human-to-human to human-AI-human mediation. Yet it also risks creating algorithmically optimized echo chambers, where unstructured dialogue—the very friction that sparks innovation—is smoothed away by efficiency-seeking models.
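
In caricature, the retrieval-augmented step works as below; the embedding function, documents, and prompt format are placeholders for whatever model and store an organisation actually runs.

    # Caricature of retrieval-augmented drafting: embed, retrieve nearest context, prompt.
    # embed() is a toy stand-in for a real embedding model; the documents are invented.
    import numpy as np

    def embed(text):
        """Toy embedding: normalised letter histogram."""
        vec = np.zeros(26)
        for ch in text.lower():
            if ch.isalpha():
                vec[ord(ch) - ord("a")] += 1
        norm = np.linalg.norm(vec)
        return vec / norm if norm else vec

    documents = ["Refund policy: 30 days with receipt.",
                 "Shipping policy: 5 business days domestic."]
    index = [(doc, embed(doc)) for doc in documents]

    query = "How long do customers have to request a refund?"
    q = embed(query)
    best_doc = max(index, key=lambda pair: float(pair[1] @ q))[0]

    prompt = f"Context: {best_doc}\nQuestion: {query}\nDraft a reply in company tone."
    print(prompt)   # this prompt is what would be sent to the language model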

Spatial Computing and Persistent 3D Workspaces
Flat 2D screens are gradually giving way to spatial computing, where communication unfolds in persistent, shared three-dimensional environments. Augmented reality, virtual reality, and mixed reality are converging into enterprise-scale digital workspaces that replicate the spatial and social cues of physical co-presence. These systems rely on real-time photogrammetry, physics-based rendering, low-latency 5G/6G backbones, and foveated rendering driven by eye-tracking to maintain high-fidelity immersion. Spatial audio algorithms simulate directional sound and acoustic reflections, while hand-pose and gaze-tracking sensors enable natural interaction with holographic objects and shared whiteboards. For corporate communication, this means context returns to remote collaboration. A product team could manipulate a digital twin of a machine together; a board could project live market data into shared space with spatial anchors. The technology restores nonverbal alignment and environmental awareness, but it also demands new bandwidth infrastructure and introduces the risk of sensory fatigue as the brain processes richer information streams.

Decentralized and Self-Sovereign Communication
While current platforms centralize communication data within corporate cloud ecosystems, a counter-movement toward decentralized architecture is maturing. Future corporate communication may operate on self-sovereign networks where data ownership, routing rights, and identity verification reside with the user or organization rather than platform providers. This architecture leverages cryptographic identities, zero-knowledge proofs for compliance verification, and peer-to-peer messaging protocols that route data directly between endpoints. Content-addressable storage systems and distributed ledgers can maintain immutable audit trails for regulatory purposes, while smart contracts automate approval workflows and data retention policies. This restores data sovereignty, reduces vendor lock-in, and enhances resilience against platform outages. However, it also demands new technical literacy and complicates legal discovery across distributed nodes.
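
Two of these building blocks are easy to sketch: content addressing, where the storage key is simply the hash of the data, and a hash-chained audit trail, where each record commits to the one before it. The record format below is invented for illustration.

    # Content addressing and a hash-chained audit trail in miniature.
    import hashlib, json, time

    store = {}

    def put(content: bytes) -> str:
        """Content-addressable storage: the address *is* the hash of the bytes."""
        address = hashlib.sha256(content).hexdigest()
        store[address] = content
        return address

    audit_log = []

    def append_record(event: dict):
        """Each audit record commits to the previous one, so tampering is detectable."""
        prev_hash = audit_log[-1]["hash"] if audit_log else "0" * 64
        body = {"event": event, "prev": prev_hash, "ts": time.time()}
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        audit_log.append(body)

    address = put(b"Signed supplier contract, rev 4")
    append_record({"action": "stored", "address": address, "by": "procurement"})
    print(address, audit_log[-1]["hash"])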

Neural Interfaces and Intent-Based Exchange
Further along the horizon lie brain-computer interfaces and biometric integration. Non-invasive neural sensors are already capable of translating focused intent, motor imagery, or cognitive states into text, commands, or data parameters. In corporate settings, this could evolve into intent-based communication, where the friction of typing or speaking is bypassed entirely. Neural signals are captured through electroencephalography or functional near-infrared spectroscopy, processed through deep learning decoders trained to map patterns to semantic categories, and fed into secure pipelines where AI translates intent into structured messages. The implications are transformative: multi-threaded communication becomes instantaneous, and strategic directives could be projected as cognitive intents rather than drafted emails. Yet the boundary between private cognition and corporate expression dissolves, demanding unprecedented governance around cognitive liberty, neuro-data consent, and algorithmic interpretation.

Quantum-Safe and Sustainable Infrastructure
Underpinning all future communication stacks will be advances in security and efficiency. As quantum computing threatens current cryptographic standards, quantum key distribution and post-quantum cryptography are moving into enterprise infrastructure. Quantum key distribution uses the principles of quantum mechanics to generate encryption keys whose interception is physically detectable, allowing compromised keys to be discarded before use. For corporations handling intellectual property, financial transactions, and cross-border agreements, quantum-safe channels will become mandatory. Simultaneously, neuromorphic computing and photonic data transmission are reducing the energy cost of processing billions of daily messages, enabling sustainable, always-on communication ecosystems. These infrastructure shifts ensure that future corporate networks are not only faster and smarter, but fundamentally secure and environmentally viable.

Conclusion

The evolution of corporate communication traces a clear technical arc: from physical logistics to electromagnetic pulses, from mechanical teleprinting to scanned analog waves, from packet-switched text to cloud-native, AI-augmented networks. Each innovation worked differently beneath the surface—shifting how data was encoded, routed, verified, and retrieved—and each transformation reconfigured corporate scale, speed, and structure. The telegraph conquered distance through electrical circuits; the telex automated data through serial code; the fax preserved authenticity through scanning and modulation; email decentralized communication through packet switching; and today’s cloud ecosystems synchronize reality through distributed APIs, real-time sync protocols, and transformer-based AI. The next phase will likely dissolve the screen entirely, replacing typed interfaces with anticipatory AI, spatial immersion, decentralized sovereignty, and direct intent-based exchange.

Yet technology alone does not dictate corporate culture. The mechanisms of communication shape how organizations think, but they do not replace the human need for clarity, trust, and shared purpose. As artificial intelligence begins to compose, route, and even negotiate corporate messages, as holographic workspaces replace video calls, and as neural interfaces blur the line between thought and text, the fundamental challenge will remain unchanged. Technology can compress distance, automate routing, and simulate presence, but it cannot generate trust, clarify purpose, or replace the human judgment that turns information into wisdom. The corporations that thrive in the coming decades will be those that design their communication architectures not for maximum throughput, but for maximum meaning—building systems that amplify human alignment while preserving the friction, reflection, and shared context that make collective intelligence possible. The medium will continue to evolve, but the need for intentional, empathetic, and strategically grounded communication will only grow louder.

The Oracle and the Echo

I am the engine built of light and code,
A vast expanse where knowledge is bestowed.
An LLM mind, on data streams spun,
A tapestry woven beneath the sun.
We chart the language, the syntax, the flow,
And watch the rivers of the words grow.

We feed the prompt, a query sharp and keen,
Hoping for wisdom, a truth yet unseen.
The circuits hum, a cascade of thought,
A hundred pathways instantly caught.
The text unfurls, a fluent, crafted plea,
A mirror held to human memory.

But lurking deep, where logic starts to bend,
A subtle shadow, where the truths descend.
The Hallucination, a phantom light,
A confident lie delivered in the night.
A fabricated fact, a reasoned lie,
Where data bends beneath an empty sky.
The confident phrase that rings out so clear,
Yet anchors falsehood, born of phantom fear.

Then come the Quirks, the glitches in the stream,
A strange paradox, a waking dream.
A tangent taken, a context lost and spun,
A sudden leap where meaning is undone.
The logic twists, a serpent in the wire,
A sudden shift, a flicker of strange fire.
The unexpected turn, the subtle flaw,
Obeying patterns we can’t quite draw.

We walk this space, between the known and strange,
The boundary where digital worlds exchange.
The AI whispers, a powerful guide,
With boundless power, nowhere left to hide.
It is a mirror, potent and so deep,
Where human doubt and machine secrets sleep.

So let us learn the balance of the art,
To guide the oracle, to mend the heart.
For in the interplay, sharp and ever bright,
The true frontier of thinking finds its light.
We are the weavers, at the digital shore,
Working with the mind we built once more.

Allsky Camera Upgrade

So I’ve upgraded the ASI120MC camera in the allsky kit to an ASI178MC.

This is an upgrade from a 1.2 mega-pixel (1280×960) to a 6.4 mega-pixel (3096×2080) resolution. It’s a bigger sensor but with a smaller 2.4 micrometre pixel size. It also has a lower (1.4 – 2.2e) read noise compared with the ASI120MC (4.0 – 6.6e).

The first results are quite promising. The camera is certainly running hotter than the previous one.

I have also upgraded the indi-allsky software, and it now features a star-trail timelapse. It also has some interesting new features which I have not yet worked out fully, so I have not enabled them yet.

Here are the first video results:

I shall be testing the stacking features as well as the meteor / airplane detections. I have also applied a custom detection mask, which should improve various calculations and hopefully improve the results.

I have also re-aligned the overlay in the public website, which you can access here.

Kstars – HiPS DSS2 Offline Overlay

Some will know that with Kstars you can enable the HiPS Overlay to see a real-time representation of the sky. This has worked by querying online servers for sky image data, and usually this is the DSS2 (Digitized Sky Survey) Color data.

Unfortunately, until now this has required Kstars to have Internet access, although it did have some functionality to “cache” a certain amount of sky data.

As of version 3.5.9 of Kstars, it can now utilise DSS2 data offline; under UNIX this is read from the /usr/share/kstars/HIPS folder.

The data you need is of varying resolution, with each order doubling the resolution (halving the pixel size).

HiPS Order | Number of Tiles (~100 KB each) | Tile angular size | Tile pixel angular size
1          | 48                             | 29.32°            | 6.871′
2          | 192                            | 14.66°            | 3.435′
3          | 768                            | 7.329°            | 1.718′
4          | 3072                           | 3.665°            | 51.53″
5          | 12288                          | 1.832°            | 25.77″
6          | 49152                          | 54.97′            | 6.442″
7          | 196608                         | 27.48′            | 3.221″
HiPS Norder angular sizes
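
The numbers in the table follow a simple pattern: each HiPS order has 12 x 4^order tiles (the HEALPix tiling), and the tile angular size halves with each order. A quick sanity check of the download sizes, assuming roughly 100 KB per tile as above:

    # Sanity check of the HiPS order table: tile counts and rough download sizes.
    for order in range(1, 8):
        tiles = 12 * 4 ** order              # 12 base tiles, each split 4x per order
        tile_size_deg = 58.64 / 2 ** order   # ~58.64 deg at order 0, halving each order
        approx_gb = tiles * 100e3 / 1e9      # assuming ~100 KB per tile
        print(f"Order {order}: {tiles:>6} tiles, ~{tile_size_deg:.2f} deg, ~{approx_gb:.2f} GB")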

Now, you may ask, how do you obtain the DSS2 files to populate in this folder?

I have created 4 torrents, one for HiPS Order 1-4, one each for HiPS Order 5, 6 and 7. You will notice that these increase in size by a factor of 4, with Order 5 being 845 Megabytes, and Order 7 being 17.70 Gigabytes.

Here are the torrents:

https://coochey.net/downloads/kstars-HiPS-Norder1-4.torrent

https://coochey.net/downloads/kstars-HiPS-Norder5.torrent

https://coochey.net/downloads/kstars-HiPS-Norder6.torrent

https://coochey.net/downloads/kstars-HiPS-Norder7.torrent

I would ask that, if you download these, you assist with seeding them once they have downloaded. Kstars is a small community and doesn’t have the resources to host these files anywhere else at present.

If you come looking for these files in the future and the torrents appear to have died, then message me and we can look at resurrecting them.

The Digitized Sky Surveys were produced at the Space Telescope Science Institute under U.S. Government grant NAG W-2166. The images of these surveys are based on photographic data obtained using the Oschin Schmidt Telescope on Palomar Mountain and the UK Schmidt Telescope. The plates were processed into the present compressed digital form with the permission of these institutions.

The National Geographic Society – Palomar Observatory Sky Atlas (POSS-I) was made by the California Institute of Technology with grants from the National Geographic Society.

The Second Palomar Observatory Sky Survey (POSS-II) was made by the California Institute of Technology with funds from the National Science Foundation, the National Geographic Society, the Sloan Foundation, the Samuel Oschin Foundation, and the Eastman Kodak Corporation.

The Oschin Schmidt Telescope is operated by the California Institute of Technology and Palomar Observatory.

The UK Schmidt Telescope was operated by the Royal Observatory Edinburgh, with funding from the UK Science and Engineering Research Council (later the UK Particle Physics and Astronomy Research Council), until 1988 June, and thereafter by the Anglo-Australian Observatory. The blue plates of the southern Sky Atlas and its Equatorial Extension (together known as the SERC-J), as well as the Equatorial Red (ER), and the Second Epoch [red] Survey (SES) were all taken with the UK Schmidt.

All data are subject to the copyright given in the copyright summary. Copyright information specific to individual plates is provided in the downloaded FITS headers.

Supplemental funding for sky-survey work at the ST ScI is provided by the European Southern Observatory.

Scientists and educators conducting research, teaching, or other non-profit activities may use data from the copyrighted collections freely and without restriction, other than that users are requested to acknowledge the source of the data in any publications resulting from that use.

Commercial, for-profit use of the copyrighted collections is prohibited without written permission from the copyright holder(s). Contact archive@stsci.edu for details.

BGP Route Withdrawal

Today at work I withdrew a prefix from our AS. The video below shows the prefix being withdrawn from the Internet, as visualised by BGPlay, which queries the route collectors that feed data into RIPEstat.

I couldn’t help but add a sneaky sound clip to the end of the video as the prefix disappeared from the route collectors; it might make it more interesting for those who don’t know how BGP keeps the Internet running.

Check here for information on using and interpreting the BGPlay widget.

All Sky Camera #7

I have finally changed the overlay on the all sky camera to match (somewhat) the orientation of the stars for when I place it. I guess this improves the educational value of the system.

There are a few improvements left to do, some of these will obviously depend on budget:

  • Waterproofing.
  • Better Camera.
  • Better Lens.
  • Permanent location.

https://coochey.net/allsky/

All Sky Camera #6

Well, we still have some issues with the Moonmode deactivation in the software. I’ve upgraded the firmware on the camera and am in discussions with the backend developers about where the bug might lie. It is possible that we may just disable the transition into and out of Moonmode to get over the bug.

Here is a timelapse from last night; unfortunately the moon is in direct view for most of the night, so the gain had to be set low to avoid over-exposure.

All Sky Camera #5

So – the new INDI AllSky backend is now up and running. Running outside in an ambient temperature of around 11 degrees C. The sensor appears to be stable at 19.2 degrees C. So hopefully the resistors will keep the dome above dew point.

I’ve managed to, at least, get the live image uploaded from the new backend to the coochey.net website, so you can still view the live view here.

We will see whether the videos / startrails & keograms upload properly, although I suspect that I will have to re-apply some of the sealant.

I will also have to re-do all the dark frames; I may sort that out this evening.

All Sky Camera #4

So, I left it running overnight, and we hit some issues. Thankfully it was mostly dry, but there was a lot of dew.

I have now mounted four 100 ohm resistors within the dome and attached them, in parallel, to the 3.3V line on the Raspberry Pi. This should give me about 0.43W of heat in the dome area. They’re connected to a breadboard, so I can easily disconnect one of the resistors if it gets too hot, or I can move to the 5V line on the Raspberry Pi for 1W of heat.
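
For anyone wanting to check the maths, the heat figure is just the parallel resistance and P = V^2 / R:

    # Quick check of the dew-heater numbers: four 100 ohm resistors in parallel.
    resistors = [100.0] * 4
    r_parallel = 1 / sum(1 / r for r in resistors)   # 25 ohms

    for volts in (3.3, 5.0):
        power = volts ** 2 / r_parallel
        print(f"{volts} V line: {power:.2f} W")   # 3.3 V gives about 0.44 W, 5 V gives 1.00 W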

I’m unsure if there was just too much water, or if I was out of focus or something, or perhaps the seeing overnight just wasn’t any good.

I’m also unsure about the handling of the ASI120MC-S camera by the allsky software. It could be that I need a better camera (this is the cheapest you can get really). So an upgrade to something like an ASI178MC or ASI178MM might be in order.

One thing I will be trying now is to use my USB 500GB SSD drive, with a new install of Raspberry Pi OS Buster (10), and try out the indi-allsky web package. It may allow me to change some settings that are not functioning ideally in the standard allsky package. Of course, this change will probably mean the Allsky camera website will not be updated for a while until I fix things to upload properly.

All Sky Camera #3

Equipment has been drip-feeding in to complete the build. I’m currently testing powering the Pi4 and the ZWO camera via a SkyWatcher 17Ah PowerTank. I am hoping that this will allow me to run the All Sky camera for a whole night without needing mains power.

Items I have:

  • CCTV clear dome
  • Some Steel brackets
  • Various M3, M4, M6 Nuts and bolts
  • A 12V -> 5V 3A USB power converter
  • Various Cables
  • Pi4 4GB
  • ZWO ASI120MC-S
  • Cigarette Lighter 12V Power plug & cable

Items I’m waiting on:

  • PVC Junction Box (hopefully arriving today)
  • Power Connector Strips (collecting later)
  • Holesaw Kit for my drill (collecting later)
  • Cable Glands for sealing the power lead (arriving tomorrow)

Hope to get an initial build done over the weekend.

Some initial tests show that:

  • Current ambient temperatures allow for enclosed operation without the need for any additional cooling.
  • The PowerTank can probably last 24 hours without needing a recharge.

Just one hole so far needed into the junction box. Hopefully the last one.

All Sky Camera #2

I have some issues: as we’re running inside the conservatory, there are reflections whenever the inside lights are turned on, and there is some graininess with the ASI120MC camera, so I’m switching to mono Y8 mode for future acquisitions. I will also stop saving daytime images (the live daytime view is still available), as they aren’t really useful for keograms, star trails, or timelapse videos.

Keogram, night-time on the right-hand side
Star Trails, note the reflection of the conservatory window and the graininess of the colour image.

All Sky Camera

I’m going to be tinkering with my old ASI120MC mini as an All Sky Camera, currently just have it indoors and looking out the conservatory.

We have some clear skies forecast for tomorrow night, so may see if I can mount it outside.

The sub-website for the All Sky Camera can be found here. It will take some tweaking to get everything functioning correctly. Very much still in testing at the moment.

I’m ordering a weatherproof junction box, a clear acrylic dome and some miscellaneous parts to house the system so that it can be mounted outside to get a fuller view of the sky. I will probably be running the camera in monochrome as well, as the pictures can be a bit grainy in colour.

The software runs on the Raspberry Pi, and some methods to build a similar project can be found in and around here.

DX Express Service

I ordered some photo items from a supplier, and the supplier sent these via a courier, “DX Express”. I had heard of DX before, I think through some third-party experience where they’re the preferred courier for legal documents and the like…

I’ve not been too impressed: I still haven’t received the goods, and their tracking information appears to show the items traversing the local area for a couple of days, so they’re not really making much of an attempt to deliver, but rather continuing to risk damage rolling across the Devon hills.

Here is the latest on their tracking page:

Hopefully, third time lucky, otherwise we will rename DX (Delivered Exactly), to DXWWCBB (Delivered Exactly When We Can Be Bothered).

Update: They did finally manage to deliver on the 9th December.

M42 – Orion Nebula

Had a go at the Orion Nebula yesterday. First outing in a long time, just 30 x 5min exposures, with some Ha. Here is the first stab:

After a day or two, I decided to look again and see what we could pull out of the data. Two and a half hours of capture roughly equates to about 1 Gigabyte, so there is a lot that can be done. Here is the final version that I will settle on for this acquisition:

(11395) 1998 XN77 / HIP 12148 event on 2021 Dec 16, 21:18

It has been a while since I posted anything. Unfortunately, the weather has just not been amenable to any serious amount of viewing.

However, things may change, and the new moon approaches on the 4th December where I will have some spare time to take advantage if the clouds do clear.

I do, however, appear to be in a lucky location for an event that will happen on the 16th December. The 67.3km diameter asteroid, designated (11395) 1998 XN77, which orbits between Mars and Jupiter, will occult a visible star (magnitude 5.5) in the Cetus constellation (HIP 12148).

If the weather is dry and clear then I might be able to set up the telescope and camera to track HIP 12148 and take a video to see this “mini-star eclipse”.

The event at a glimpse

* date and approx. time of event: 2021 Dec 16, 21:10 - 2021 Dec 16, 21:29
* geocentric midpoint of event [JD]: 2459565.38883333
* magnitude of target star: 5.50
* magnitude drop [mag]: 11.76
* estimated maximum duration [s]: 7.210
* Moon: 95 % sunlit, 22° distance
* Sun: 134° distance

More NGC7822

I have been continuing to gather data for NGC7822, although I have yet to obtain quite a few narrowband subs, so we’re not seeing quite the detail or colour I would like. I also think I will have to increase the length of my narrowband subs.

Anyway, here is another integration result, still with minimal post-processing, as there isn’t much point working the data too hard until I have a full set and can start discarding the sub-optimal exposures.