


HCSS AI: Construction-Grade Security & Privacy
Your data is your competitive advantage.
Whether it’s historical bid data in HeavyBid, production rates in HeavyJob, or operational insights across your platform, we protect your data with the same care we apply to our own.
AI solutions should make you more efficient, not introduce risk. Here’s how we ensure that for HCSS’s products.
Your data is your intellectual property
The data you enter into HCSS belongs to you. Your competitive edge stays yours.
- We never share your bid strategies, labor rates, or production data.
- Your proprietary information is not exposed or accessible to other contractors.
- We never use your data to train any AI models for other customers.
Isolated by design
HCSS Copilot operates within your secure tenant environment.
When Copilot drafts a response, whether summarizing a change order or analyzing a variance, it uses your organization’s data only to provide context, never to train AI models.
Contractor A’s production rate will never assist Contractor B. There is no cross-customer data sharing.
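Tenant isolation of this kind is typically enforced at the retrieval layer: every lookup is scoped to the requesting organization before any data reaches the AI. The sketch below is purely illustrative of that pattern, not HCSS’s actual implementation; the record structure, tenant IDs, and the `fetch_production_rates` function are hypothetical.

```python
# Illustrative sketch of tenant-scoped retrieval: every query is filtered
# by the caller's tenant ID, so one contractor's records can never appear
# in another contractor's AI context. All names here are hypothetical.
RECORDS = [
    {"tenant_id": "contractor-a", "activity": "excavation", "rate": 120.0},
    {"tenant_id": "contractor-b", "activity": "excavation", "rate": 95.0},
]

def fetch_production_rates(tenant_id: str) -> list[dict]:
    """Return only the rows belonging to the requesting tenant."""
    return [r for r in RECORDS if r["tenant_id"] == tenant_id]

# Contractor A's context contains no Contractor B data.
context_a = fetch_production_rates("contractor-a")
assert all(r["tenant_id"] == "contractor-a" for r in context_a)
```

Because the filter is applied before prompt assembly, there is no code path through which another tenant’s rows could enter a response.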
Permission-aware AI
Copilot respects the user permissions you’ve already set.
If someone cannot view markup, indirect costs, safety incidents, or financial details in HCSS products, the AI will not show or use that data in its responses. Copilot follows your security rules, not the other way around.
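Permission-aware behavior like this generally means stripping restricted fields from the AI’s context before the model sees them. The following is a minimal sketch of that idea under assumed names; the field names, permission keys, and `filter_context` function are hypothetical, not HCSS’s actual code.

```python
# Hypothetical sketch of permission-aware context filtering: fields a user
# cannot see in the product are removed before the AI ever receives them.
SENSITIVE_FIELDS = {"markup": "view_markup", "indirect_costs": "view_financials"}

def filter_context(record: dict, user_permissions: set[str]) -> dict:
    """Drop any field the requesting user lacks permission to view."""
    return {
        field: value
        for field, value in record.items()
        if field not in SENSITIVE_FIELDS
        or SENSITIVE_FIELDS[field] in user_permissions
    }

job = {"job_name": "Highway 6 Widening", "markup": 0.12, "indirect_costs": 50000}
# A user without financial permissions gets a context with no markup
# or indirect costs, so the AI cannot surface them.
assert filter_context(job, {"view_jobs"}) == {"job_name": "Highway 6 Widening"}
```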
Encrypted end-to-end
Copilot data is protected just like all HCSS data:
- Encrypted in transit;
- Encrypted at rest; and
- Secured within enterprise cloud infrastructure.
There is no separate or reduced security standard for Copilot features.
Built on Azure Enterprise
HCSS Copilot is built with private Microsoft Azure Enterprise Infrastructure and uses Azure OpenAI Models.
This ensures:
- Your data remains within HCSS’s private Azure tenant just as it always has;
- Prompts and outputs are not used to train public AI models;
- Processing stays inside the Azure Trust Boundary; and
- Enterprise-grade security and compliance standards apply.
Your data does not leave our secure environment.
Built for construction, not experimentation
HCSS Copilot is designed to:
- Assist your teams;
- Improve decision speed; and
- Reduce manual effort.
It does not:
- Replace professional judgment;
- Make autonomous financial decisions; or
- Expose your competitive data.
You stay in control, always.
You can choose to disable HCSS Copilot for your environment. If it is disabled, your users will not have access to its AI features, and your data will not be processed to power it.
Frequently asked questions
Is my data being used to train AI models?
No. Customer data:
- Is not used to train global AI models;
- Is not used to train other customers’ models; and
- Stays within HCSS’s private Azure tenant.
We use Microsoft Azure OpenAI within the Azure Trust Boundary, meaning prompts and outputs are not used to improve public models.
Is my data shared across customers?
Absolutely not. Each customer operates in an isolated tenant environment. Copilot responses are generated using only that customer’s data. There is no cross-customer pooling or sharing.
Could another contractor’s data influence my results?
No. Copilot responses are scoped exclusively to your tenant. There is no mechanism for cross-contractor learning or leakage.
Can Copilot show information users normally cannot see?
No. Copilot respects existing HCSS permission structures across products. If a user cannot access certain financial or operational fields, Copilot will not surface that information.
Is the data encrypted?
Yes. AI-related data is encrypted in transit and at rest, and it meets the same enterprise security standards as the rest of HCSS.
Where does the data live?
Within Microsoft Azure Enterprise infrastructure. All processing happens within HCSS’s secure Azure tenant. Data does not leave the environment.
Are you using public ChatGPT?
No. We use Azure OpenAI within enterprise controls. This is different from consumer-grade public AI tools. Customer data does not flow into public systems.
What happens if Copilot generates an incorrect answer?
Copilot is designed as an assistive tool. Users remain in control: they are responsible for final approvals and can review and validate outputs. Copilot does not autonomously submit bids, approve change orders, or alter financial data.
What is “in-context learning” in a construction setting?
In-context learning means Copilot uses your data only to answer your question at that moment. It does not retain, store, or learn from your information afterward.
Think of it like a consultant who reviews your project details to provide an answer, then leaves without keeping any of your documents.
Your data is used to generate the response, not train future models.
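The consultant analogy can be made concrete: with in-context learning, the tenant’s data is placed into the prompt for a single request and discarded afterward, and no model weights are updated. This sketch illustrates only that general pattern; the `answer_question` function and its prompt format are hypothetical, not HCSS’s implementation.

```python
# Hypothetical sketch of in-context learning: tenant data exists only
# inside the prompt for one request. The function keeps no state, so
# nothing is retained or "learned" between calls.
def answer_question(question: str, tenant_records: list[str]) -> str:
    context = "\n".join(tenant_records)
    prompt = f"Context:\n{context}\n\nQuestion: {question}"
    # A real system would send `prompt` to the model here; the key point
    # is that nothing is written back to the model or to shared storage.
    return prompt

p1 = answer_question("What was last week's excavation rate?",
                     ["Excavation: 120 cy/day"])
# A later call with no records produces a prompt containing none of the
# earlier data, because no state survived the first call.
p2 = answer_question("Same question", [])
assert "120" in p1 and "120" not in p2
```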
Why should we trust Copilot?
Because it:
- Operates within the same secure infrastructure as other HCSS products;
- Respects your data ownership;
- Honors your permission structure;
- Does not expose proprietary information; and
- Assists, rather than replaces, professional judgment.
To learn more about how we process personal data, please see our Privacy Policy.
Didn't get all of your questions answered?
Request Info