Refactor/auth and role refactoring #167
Conversation
Walkthrough

This pull request introduces a comprehensive enhancement to the application's data model and authentication system. The primary focus is on establishing a relationship between customers and users by adding a …

Changes
Sequence Diagram

```mermaid
sequenceDiagram
    participant Client
    participant AuthService
    participant RolesService
    participant DatabaseService
    Client->>AuthService: Request with JWT/API Key
    AuthService->>RolesService: Check Table Permissions
    RolesService-->>AuthService: Permission Response
    alt Public Table Access
        AuthService->>DatabaseService: Perform Operation
    else Authenticated Access
        AuthService->>DatabaseService: Validate User-Record Relationship
        DatabaseService-->>AuthService: Record Details
        alt User Has Permission
            AuthService->>DatabaseService: Perform Operation
        else Unauthorized
            AuthService-->>Client: Access Denied
        end
    end
```
```ts
if (!permission.valid) {
	errored++
	errors.push({
		item: body.indexOf(item),
```

Check failure: Code scanning / CodeQL

Type confusion through parameter tampering (Critical): potential type confusion, as this HTTP request parameter may be either an array or a string.

Copilot Autofix AI (about 1 month ago)

To fix the problem, we need to ensure that the `body` parameter is of the expected type before performing any operations on it. Specifically, we should check whether `body` is an array or a string at the beginning of the `deleteMany` method. If `body` is not of the expected type, we should return a 400 Bad Request response immediately.

The best way to fix the problem without changing existing functionality is to add a type check for `body` at the beginning of the `deleteMany` method. This ensures that `body` is either a string or an array before any further operations are performed on it.

```diff
@@ -166,2 +166,5 @@
 ): Promise<DeleteManyResponseObject> {
+	if (typeof body !== 'object' || !Array.isArray(body)) {
+		return res.status(400).send(this.response.text('Body must be an array'))
+	}
 	const x_request_id = headers['x-request-id']
```
```ts
if (!permission.valid) {
	errored++
	errors.push({
		item: body.indexOf(item),
```

Check failure: Code scanning / CodeQL

Type confusion through parameter tampering (Critical): potential type confusion, as this HTTP request parameter may be either an array or a string.

Copilot Autofix AI (about 1 month ago)

To fix the problem, we need to ensure that the `body` parameter is properly validated to be either an object or an array of objects before it is used. This can be done by adding a type check at the beginning of the `create` method. If `body` is not of the expected type, we should return a 400 Bad Request response.

```diff
@@ -40,2 +40,5 @@
 ): Promise<FindOneResponseObject | CreateManyResponseObject> {
+	if (typeof body !== 'object' || (Array.isArray(body) && !body.every(item => typeof item === 'object'))) {
+		return res.status(400).send(this.response.text('Invalid body format'));
+	}
 	const x_request_id = headers['x-request-id']
```
```ts
if (!(body instanceof Array)) {
	return res.status(400).send(this.response.text('Body must be an array'))
}
const total = body.length
```

Check failure: Code scanning / CodeQL

Type confusion through parameter tampering (Critical): potential type confusion, as this HTTP request parameter may be either an array or a string.

Copilot Autofix AI (about 1 month ago)

To fix the problem, we need to ensure that the `body` parameter is of the expected type before any operations are performed on it. This can be done by adding a type check at the beginning of the `updateMany` function. If `body` is not an array, we should immediately return a 400 Bad Request response.

```diff
@@ -177,2 +177,5 @@
 ): Promise<UpdateManyResponseObject> {
+	if (!Array.isArray(body)) {
+		return res.status(400).send(this.response.text('Body must be an array'));
+	}
 	const x_request_id = headers['x-request-id']
```
```ts
if (!validate.valid) {
	errored++
	errors.push({
		item: body.indexOf(item),
```

Check failure: Code scanning / CodeQL

Type confusion through parameter tampering (Critical): potential type confusion, as this HTTP request parameter may be either an array or a string.

Copilot Autofix AI (about 1 month ago)

To fix the problem, we need to ensure that the `body` parameter is not only an array but also that its elements are of the expected type. We can achieve this by adding a type check for the elements within the `body` array. This will prevent type confusion attacks by ensuring that the `body` parameter and its elements are of the expected type.

- Add a type check for the elements within the `body` array.
- Ensure that the `body` parameter is an array of objects before proceeding with the rest of the function logic.

```diff
@@ -225,2 +225,7 @@
 }
+for (const item of body) {
+	if (typeof item !== 'object' || item === null) {
+		return res.status(400).send(this.response.text('Each item in the body must be a non-null object'))
+	}
+}
 const total = body.length
```
```ts
if (!validateKey.valid) {
	errored++
	errors.push({
		item: body.indexOf(item),
```

Check failure: Code scanning / CodeQL

Type confusion through parameter tampering (Critical): potential type confusion, as this HTTP request parameter may be either an array or a string.

Copilot Autofix AI (about 1 month ago)

To fix the problem, we need to ensure that the `body` parameter is of the expected type (an array) as soon as it is received. This can be done by adding a type check at the beginning of the `updateMany` method. If the type is not as expected, we should return a 400 Bad Request response immediately.

```diff
@@ -177,2 +177,5 @@
 ): Promise<UpdateManyResponseObject> {
+	if (!Array.isArray(body)) {
+		return res.status(400).send(this.response.text('Body must be an array'));
+	}
 	const x_request_id = headers['x-request-id']
```
```ts
if (!uniqueValidation.valid) {
	errored++
	errors.push({
		item: body.indexOf(item),
```

Check failure: Code scanning / CodeQL

Type confusion through parameter tampering (Critical): potential type confusion, as this HTTP request parameter may be either an array or a string.

Copilot Autofix AI (about 1 month ago)

To fix the problem, we need to ensure that the `body` parameter is checked for its type as soon as it is received. This will prevent any type confusion attacks by ensuring that `body` is an array before any operations are performed on it. The best way to fix this without changing existing functionality is to add a type check for `body` immediately after it is received and before any other operations are performed.

```diff
@@ -177,2 +177,5 @@
 ): Promise<UpdateManyResponseObject> {
+	if (!(body instanceof Array)) {
+		return res.status(400).send(this.response.text('Body must be an array'))
+	}
 	const x_request_id = headers['x-request-id']
```
```ts
if (!record) {
	errored++
	errors.push({
		item: body.indexOf(item),
```

Check failure: Code scanning / CodeQL

Type confusion through parameter tampering (Critical): potential type confusion, as this HTTP request parameter may be either an array or a string.

Copilot Autofix AI (about 1 month ago)

To fix the problem, we need to ensure that the `body` parameter is of the expected type (an array) before using it. We can add a type check at the beginning of the `updateMany` method to validate that `body` is an array. If it is not, we should return a 400 Bad Request response.

```diff
@@ -177,2 +177,5 @@
 ): Promise<UpdateManyResponseObject> {
+	if (!Array.isArray(body)) {
+		return res.status(400).send({ message: 'Invalid request body' });
+	}
 	const x_request_id = headers['x-request-id']
```
```ts
} catch (e) {
	errored++
	errors.push({
		item: body.indexOf(item),
```

Check failure: Code scanning / CodeQL

Type confusion through parameter tampering (Critical): potential type confusion, as this HTTP request parameter may be either an array or a string.

Copilot Autofix AI (about 1 month ago)

To fix the problem, we need to ensure that the `body` parameter is an array before processing it. This can be done by adding a type check at the beginning of the `updateMany` method. If the `body` is not an array, we should return a 400 Bad Request response.

```diff
@@ -177,2 +177,5 @@
 ): Promise<UpdateManyResponseObject> {
+	if (!Array.isArray(body)) {
+		return res.status(400).send(this.response.text("Invalid request body"));
+	}
 	const x_request_id = headers['x-request-id']
```
Actionable comments posted: 9
🧹 Nitpick comments (40)
src/testing/auth.testing.service.ts (1)

35-44: Consider adding input validation for `table` and `access_level`

The method accepts parameters without validation. Consider adding checks for:
- Empty or invalid table name
- Valid `RolePermission` enum value for `access_level`

```diff
 async createPublicTablesRecord(data: { table: string; access_level: RolePermission }): Promise<FindOneResponseObject> {
+	if (!data.table?.trim()) {
+		throw new Error('Table name is required')
+	}
+	if (!Object.values(RolePermission).includes(data.access_level)) {
+		throw new Error('Invalid access level')
+	}
 	const schema = await this.schema.getSchema({ table: LLANA_PUBLIC_TABLES, x_request_id: 'test' })
 	return (await this.query.perform(QueryPerform.CREATE, <DataSourceCreateOneOptions>{
 		schema,
 		data,
 	})) as FindOneResponseObject
 }
```

src/app.controller.delete.ts (2)
61-77: Enhance error handling for public access check

The public access check could be enhanced with more specific error messages and logging.

```diff
 // Is the table public?
 let auth = await this.authentication.public({
 	table: table_name,
 	access_level: RolePermission.DELETE,
 	x_request_id,
 })

 // If not public, perform auth
 if (!auth.valid) {
+	this.logger.debug(`Public access denied for table ${table_name}, attempting authentication`)
 	auth = await this.authentication.auth({
 		table: table_name,
 		x_request_id,
 		access: RolePermission.DELETE,
 		headers: req.headers,
 		body: req.body,
 		query: req.query,
 	})

 	if (!auth.valid) {
+		this.logger.debug(`Authentication failed for table ${table_name}: ${auth.message}`)
 		return res.status(401).send(this.response.text(auth.message))
 	}
 }
```
118-131: Add transaction support for role permission checks

Consider wrapping the role permission check in a transaction to ensure data consistency. The role permission check and subsequent delete operation should be atomic to prevent race conditions where permissions might change between the check and the actual deletion.
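The check-then-delete flow the reviewer describes can be sketched as a single unit of work. The `Tx` interface and `deleteWithPermissionCheck` below are hypothetical, not Llana's actual API; a real fix would use the project's own datasource transaction support so both calls share one transaction handle.

```typescript
// Hypothetical sketch: run the permission check and the delete in one unit of
// work, so a permission revoked mid-flight cannot slip between the two steps.
// The Tx interface below is illustrative, not the project's real API.
interface Tx {
	hasPermission(userId: number, table: string): Promise<boolean>
	deleteRecord(table: string, id: number): Promise<void>
}

async function deleteWithPermissionCheck(
	tx: Tx,
	userId: number,
	table: string,
	id: number,
): Promise<boolean> {
	// Both calls happen against the same transaction handle; the caller
	// commits only if this function returns, and rolls back on error.
	if (!(await tx.hasPermission(userId, table))) {
		return false // permission denied: nothing deleted
	}
	await tx.deleteRecord(table, id)
	return true
}
```

The caller would open the transaction, invoke the helper, then commit or roll back, keeping the check and the mutation atomic.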
src/app.controller.put.ts (1)

Line range hint 54-89: Enhance public access check with caching

Consider caching the public access check results to improve performance. The public access check is performed frequently and could benefit from caching. Consider implementing a cache with a short TTL to reduce database queries.
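A short-TTL cache like the one suggested can be very small; the class and key scheme below are assumptions for illustration, since the PR does not show a cache implementation (a production version would more likely reuse the existing cache layer the Roles helper already consults).

```typescript
// Minimal in-memory TTL cache sketch for public-access check results.
class TtlCache<V> {
	private store = new Map<string, { value: V; expires: number }>()

	constructor(private readonly ttlMs: number) {}

	get(key: string): V | undefined {
		const entry = this.store.get(key)
		if (!entry) return undefined
		if (Date.now() > entry.expires) {
			this.store.delete(key) // evict stale entry lazily
			return undefined
		}
		return entry.value
	}

	set(key: string, value: V): void {
		this.store.set(key, { value, expires: Date.now() + this.ttlMs })
	}
}

// Example (hypothetical key scheme): cache per table + access level for 30s.
const publicAccessCache = new TtlCache<boolean>(30_000)
publicAccessCache.set('Customer:READ', true)
console.log(publicAccessCache.get('Customer:READ')) // true while the TTL holds
```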
src/helpers/Authentication.ts (1)

39-43: Add rate limiting for public access checks

Consider implementing rate limiting for public access checks to prevent abuse. The public access endpoint could be vulnerable to DoS attacks. Consider implementing rate limiting based on IP address or other identifiers.
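A fixed-window limiter keyed by client IP is one minimal way to do this; the sketch below is an assumption about shape rather than the project's code, and a real NestJS deployment would more likely reach for an off-the-shelf throttler module.

```typescript
// Sketch of a fixed-window rate limiter keyed by a client identifier (e.g. IP).
// Accepting `now` as a parameter keeps the logic testable without real clocks.
class FixedWindowRateLimiter {
	private windows = new Map<string, { count: number; windowStart: number }>()

	constructor(
		private readonly limit: number,    // max requests per window
		private readonly windowMs: number, // window length in milliseconds
	) {}

	allow(key: string, now: number = Date.now()): boolean {
		const entry = this.windows.get(key)
		if (!entry || now - entry.windowStart >= this.windowMs) {
			// First request in a fresh window
			this.windows.set(key, { count: 1, windowStart: now })
			return true
		}
		if (entry.count >= this.limit) return false // over the limit
		entry.count++
		return true
	}
}
```

Before running the public-access check, the handler would call `allow(clientIp)` and return 429 when it yields false.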
src/app.controller.delete.test.spec.ts (1)

89-92: Use beforeEach for test data setup

Moving customer creation to beforeEach would ensure a clean state for each test.

```diff
-customers.push(await customerTestingService.createCustomer({ userId }))
-customers.push(await customerTestingService.createCustomer({ userId }))
-customers.push(await customerTestingService.createCustomer({ userId }))
-customers.push(await customerTestingService.createCustomer({ userId }))
+beforeEach(async () => {
+	const customer = await customerTestingService.createCustomer({ userId })
+	customers.push(customer)
+})
```

src/app.controller.get.ts (2)
52-53: Address the TODO for restricted table access

Currently, the method returns all tables. Consider implementing a filter that only returns tables accessible to the requesting user, based on role permission checks or public table entries.

84-110: Comprehensive role checks with meaningful error messages
- The role-based restriction logic and explicit 401 error on failures is well-structured.
- Consider logging or tracing more details (e.g., user_identifier, table_name) when permission is denied, if not already done at a higher level.
src/helpers/Query.ts (1)

Line range hint 568-606: Relation-building logic in `buildRelations`
- Excellent approach to fetch related records via a separate query.
- Throwing an error if the required primary field is missing ensures correct usage of `fields`.
- Consider carefully limiting the default relation limit of 9999. For large data sets, a pagination approach may be preferable.
src/app.service.bootup.ts (2)

Line range hint 75-120: `_llana_public_tables` schema initialization
- Simplified `access_level` replaces older fields.
- Defaults to enum `[READ, WRITE, DELETE]`. Confirm whether partial READ (like read-only certain columns) is needed or is handled elsewhere.

137-137: Example record for 'Employee' in `_llana_public_tables`

This is good for demonstration. Consider removing or parameterizing sample data in production builds.
src/app.controller.get.test.spec.ts (3)

53-54: New variables: `userId` and `user`

Declaring `userId` and `user` is crucial for tracking user entities across tests. Consider using more specific types rather than `any` to improve type safety.

296-297: TODO: Expand testing

"TODO expand to test enums, relations, and other fields" highlights future test coverage improvements. Be sure to track this. Would you like help creating additional test cases for these data types?

695-739: Allowed Fields: Overriding "fields" query param

Notably, the role's allowed fields override the user-specified fields. The minor misspelling in the test name ("fields passe") is a small nitpick.

```diff
-it('When allowed_fields are passed, only return these fields even with fields passe, with relations', ...
+it('When allowed_fields are passed, only return these fields even with fields passed, with relations', ...
```

src/testing/customer.testing.service.ts (1)
Line range hint 18-35: `mockCustomer(userId: any)` method

Incorporating `userId` into the mocked customer fosters accurate user-customer relationships for tests. Using `any` is functional, though you might consider stronger typing if feasible.

src/helpers/Roles.ts (10)
28-32: Consider Making Docstring More Descriptive

Your docstring here outlines high-level steps. A short example usage or typical scenario might help other developers quickly understand the role-checking flow.

40-40: Ensure `data` Is Properly Documented

The optional `data` property significantly impacts the permission logic (ownership checks, etc.). Add a quick note explaining what fields within `data` are required for create/update.

51-51: Use a More Specific Variable Name

`permission_result` is descriptive enough, but if more specific terms like `cachedPermissionResult` are used, it reduces confusion regarding the result's origin.

53-57: Add a Graceful Fallback When Cache Is Not Available

If the cache is down or if there's a problem retrieving items, the logic seamlessly continues. This approach is good, but consider logging any cache-related error or fallback notices to help diagnose environment issues.

141-141: Consolidate Permission Checks

You check `comparePermissions(permission.records, options.access)` multiple times. Consider consolidating the repeated logic or caching the result in a local variable.

144-144: Document Field Name Transformations

`this.formatAllowedFields(permission.allowed_fields)` is straightforward, but any transformations done inside that method might be surprising to future maintainers. A quick inline comment clarifying is recommended.

154-179: Refine Ownership-Checks Logic

This block thoroughly enforces "own_records" restrictions. However, adding more context in error messages (e.g., which field fails) might help debugging.

189-189: Visibility of Allowed Fields

Another usage of `formatAllowedFields`. It may be helpful to enumerate the returned fields in logs for debugging.

223-223: Consolidate Default vs. Custom Logic

You also perform similar checks for custom and default permissions. Consider factoring them into a helper method if it reduces duplication.

226-226: Track Field-Level Permission Differences

Here, you pass `allowed_fields: this.formatAllowedFields(permission.allowed_fields)`
. If you plan expansions, like read vs write fields, you might want to record them separately.src/datasources/mongo.datasource.ts (1)
356-361: Avoid Key Deletion Errors in `findMany`

Same logic as above. A quick check ensures that each `results[r]` is valid before accessing or deleting properties.

src/helpers/Schema.ts (2)

633-636: Provide More Relation Failure Context

If the user attempts a nested relation that doesn't exist, you're logging `Relation field {items[i]} not found...`. Adding details about the parent relation or the query might help in debugging.

638-641: Optimize Redundant Error Logging

You first log the error, then throw an `Error` with a nearly identical message. Consider a single, cohesive error-handling approach or a thorough log statement that references a unique error code.

demo/databases/mongodb.js (4)
Line range hint 158-172: Ensuring correct user referencing in the `Customer` collection

Reusing the same `userId: user._id` might be intended if only one user is needed. If supporting multiple users, plan for how you would handle `userId` in patches or additional seeds.

Line range hint 173-187: Potential for relational data filtering

If you need to filter or restrict customers by user, ensure that queries on `Customer.userId` use consistent indexing. MongoDB typically needs an index on `userId` if queries filter frequently by it.

Line range hint 188-202: No input validation or error handling

Although this is a seed script, consider basic validation or try/catch blocks around insert statements to gracefully handle unexpected data scenarios or insertion errors.

Line range hint 203-217: Repeated code pattern

Each insert block follows the same structure. If the script grows, factor out the insertion logic into a helper function. This fosters DRY principles and code clarity.
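The repeated insert blocks could be collapsed into a helper along these lines. `seedCollection` and its signature are hypothetical: it is written against a generic insert callback rather than the actual MongoDB driver calls used in the seed script.

```typescript
// Hypothetical helper to DRY up repeated seed-insert blocks. The insertMany
// callback stands in for a MongoDB collection's insertMany method.
type InsertMany = (docs: object[]) => Promise<{ insertedCount: number }>

async function seedCollection(
	name: string,
	docs: object[],
	insertMany: InsertMany,
): Promise<number> {
	try {
		const result = await insertMany(docs)
		console.log(`Seeded ${result.insertedCount} documents into ${name}`)
		return result.insertedCount
	} catch (err) {
		// A seed script can log the failure and continue rather than abort
		console.error(`Failed to seed ${name}:`, err)
		return 0
	}
}
```

Each collection's seed then becomes a one-line call, and the error handling lives in one place.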
demo/databases/json/Customer.json (3)

47-47: Applying `userId: 1` consistently

All customers refer to the same user. This is fine in a single-tenant or simple environment. If multi-tenant, clarify any plan for distributing user IDs across multiple customers.

92-92: `userId` usage aligned with other data

Check if there's a place in the code that automatically assigns `userId` on new customers. If so, ensure it doesn't conflict with the static `1` here.

137-137: All customer entries referencing the same user might limit real scenarios

The approach is understandable for single-user or demo data. In real production, distributing or randomizing user IDs for different customers might better simulate real usage.
src/app.controller.post.ts (2)

129-131: Publish changes to WebSocket and Webhook

Publishing immediately after each successful insert ensures real-time updates for both internal consumers (`websocket`) and external consumers (`webhook`). Consider a bulk-processing approach if the performance overhead of repeated calls is a concern in high-volume use cases.

234-246: Filtering results based on allowed fields

This is a solid mechanism to prevent overexposure of data. If any nested objects or arrays are part of the data, consider whether a deep filter is needed to avoid exposing nested sensitive fields.
demo/databases/mssql.sql (1)

174-193: Diversify user associations in demo data

All customers are associated with `userId = 1`. While this works for demo data, it creates an unrealistic data distribution and could mask potential issues with the user-customer relationship. Consider distributing customers across multiple users to better represent real-world scenarios and test the cascade delete behavior.
demo/databases/postgres.sql (2)

91-108: Document the purpose of the new Supplier table

A new `Supplier` table has been added without context in the PR description. While the schema looks well-designed, its relationship to the auth/role refactoring is unclear. Consider:
- Adding a `userId` column for consistency with the Customer table if suppliers should be associated with users
- Documenting the purpose and relationships of this table in the PR description
- Including any planned features or requirements that necessitated this addition

Line range hint 45-60: Ensure consistent constraints across database implementations

There are inconsistencies in the foreign key behaviors across the database implementations:
- MSSQL: `ON UPDATE NO ACTION`
- MySQL/PostgreSQL: `ON UPDATE RESTRICT`

Additionally, PostgreSQL has unique changes (email nullability, Supplier table) not present in other implementations. Maintain consistency across database implementations unless there's a specific reason for the differences. Document any intentional variations in the PR description.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (30)
- demo/databases/airtable.ts (1 hunks)
- demo/databases/auth.sql (0 hunks)
- demo/databases/json/Customer.json (10 hunks)
- demo/databases/mongodb.js (11 hunks)
- demo/databases/mssql.sql (3 hunks)
- demo/databases/mysql.sql (3 hunks)
- demo/databases/postgres.sql (6 hunks)
- src/app.constants.ts (1 hunks)
- src/app.controller.delete.test.spec.ts (5 hunks)
- src/app.controller.delete.ts (6 hunks)
- src/app.controller.get.test.spec.ts (7 hunks)
- src/app.controller.get.ts (6 hunks)
- src/app.controller.post.test.spec.ts (5 hunks)
- src/app.controller.post.ts (8 hunks)
- src/app.controller.put.test.spec.ts (7 hunks)
- src/app.controller.put.ts (4 hunks)
- src/app.service.auth.ts (1 hunks)
- src/app.service.bootup.ts (7 hunks)
- src/datasources/mongo.datasource.ts (2 hunks)
- src/datasources/mysql.datasource.ts (0 hunks)
- src/helpers/Authentication.test.spec.ts (0 hunks)
- src/helpers/Authentication.ts (4 hunks)
- src/helpers/Query.ts (3 hunks)
- src/helpers/Roles.test.spec.paused.ts (0 hunks)
- src/helpers/Roles.ts (7 hunks)
- src/helpers/Schema.ts (2 hunks)
- src/modules/websocket/websocket.service.ts (1 hunks)
- src/testing/auth.testing.service.ts (2 hunks)
- src/testing/customer.testing.service.ts (3 hunks)
- src/types/auth.types.ts (1 hunks)
💤 Files with no reviewable changes (4)
- demo/databases/auth.sql
- src/datasources/mysql.datasource.ts
- src/helpers/Roles.test.spec.paused.ts
- src/helpers/Authentication.test.spec.ts
🧰 Additional context used
🪛 GitHub Check: CodeQL
src/app.controller.post.ts
[failure] 100-100: Type confusion through parameter tampering
Potential type confusion as this HTTP request parameter may be either an array or a string.
src/app.controller.delete.ts
[failure] 230-230: Type confusion through parameter tampering
Potential type confusion as this HTTP request parameter may be either an array or a string.
src/app.controller.put.ts
[failure] 226-226: Type confusion through parameter tampering
Potential type confusion as this HTTP request parameter may be either an array or a string.
[failure] 238-238: Type confusion through parameter tampering
Potential type confusion as this HTTP request parameter may be either an array or a string.
[failure] 248-248: Type confusion through parameter tampering
Potential type confusion as this HTTP request parameter may be either an array or a string.
[failure] 268-268: Type confusion through parameter tampering
Potential type confusion as this HTTP request parameter may be either an array or a string.
[failure] 296-296: Type confusion through parameter tampering
Potential type confusion as this HTTP request parameter may be either an array or a string.
[failure] 355-355: Type confusion through parameter tampering
Potential type confusion as this HTTP request parameter may be either an array or a string.
🪛 Biome (1.9.4)
src/app.controller.put.ts
[error] 358-358: Unnecessary continue statement
Unsafe fix: Delete the unnecessary continue statement
(lint/correctness/noUnnecessaryContinue)
[error] 223-223: Use Array.isArray() instead of instanceof Array.
instanceof Array returns false for array-like objects and arrays from other execution contexts.
Unsafe fix: Use Array.isArray() instead.
(lint/suspicious/useIsArray)
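The cross-context pitfall Biome warns about can be reproduced directly in Node: an array built in another realm (here via the built-in `vm` module) fails `instanceof Array` but passes `Array.isArray`.

```typescript
import * as vm from 'vm'

// An array created in a different realm has a different Array constructor,
// so instanceof fails even though the value really is an array.
const foreignArray = vm.runInNewContext('[]')

console.log(foreignArray instanceof Array) // false: prototype from another realm
console.log(Array.isArray(foreignArray))   // true: isArray works across realms
```

The same applies to a request body parsed in a different context, which is why `Array.isArray(body)` is the safer guard.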
🔇 Additional comments (95)
src/testing/auth.testing.service.ts (1)
31-33
: LGTM: getUserId implementation is correct
The implementation correctly delegates to the authService for retrieving the user ID from the JWT.
src/app.controller.delete.ts (1)

223-225: Use Array.isArray() instead of instanceof Array

The `instanceof Array` check is not reliable for arrays from different contexts.

```diff
-if (!(body instanceof Array)) {
+if (!Array.isArray(body)) {
 	return res.status(400).send(this.response.text('Body must be an array'))
 }
```

Likely invalid or redundant comment.
src/app.controller.get.ts (9)

64-64: Unused variable `role_where`

Although `role_where` gets appended to in lines 108-110, it is never subsequently used in this method. Ensure either to apply it in the query or remove it to avoid confusion.

72-75: Implementation of public check for schema retrieval

Good approach for verifying a table's public status before enforcing stricter authorization. This layered logic helps unify permission checks and fosters a consistent security model.

79-81: Conditional role-based auth for non-public tables

Using `authentication.auth` only when `authentication.public` fails is a clean approach to separate public from private logic.

152-153: Similar "public" logic in `getById`

Reusing the public check is consistent. Ensures uniform permission rules across different get endpoints.

161-171: Auth fallback block in `getById`

The fallback to `authentication.auth` for non-public tables remains in line with the approach from `getSchema`. Maintaining consistent logic across these routes improves maintainability.

203-212: Role-based permission restriction for single-record retrieval
- The code properly merges restrictions into `options.where` or calls `convertDeepWhere` for nested relations.
- Verify that you're handling arrays of restrictions or joined clauses if advanced role rules are needed.

214-214: Silent fallback if no role restrictions

If permission checks pass and no further restrictions are added, the function simply proceeds. No issues here.

341-344: Public check on the list method

Reusing the same logic for listing records ensures consistent security checks with the other endpoints. Good job maintaining the pattern.

392-401: Apply "deep where" restrictions for role-based queries in bulk listing

Similar to `getById`, the code merges nested restrictions for user-based filtering via deep relational checks. This approach is flexible but also complex; confirm robust test coverage for tricky edge cases.
src/helpers/Query.ts (3)

24-24: Newly imported `DataSourceSchemaRelation` type

This addition clarifies relation handling, aligning with the updated relation-based queries.

618-624: `getTableRelationColumn` method usage
- Straightforward utility to pick the correct column based on the current table.
- Double-check that `org_column` is always defined for relations that originate on a different table.

630-642: `getChildTableRelation` method usage
- Symmetric logic to retrieve the "child" side of a relation.
- Be mindful of consistent naming conventions in your schema to avoid confusion with `column` vs. `org_column`.
src/app.service.bootup.ts (4)

9-9: Use of `LLANA_PUBLIC_TABLES`

Renaming from `LLANA_AUTH_TABLE` to `LLANA_PUBLIC_TABLES` can clarify usage, but ensure references in other files are updated for consistency.

71-72: Conditional creation of `_llana_public_tables`

The automated creation logic is a convenient setup step. Any existing external DB migrations or concurrency scenarios should be carefully handled or documented.

128-128: Creation check for `_llana_public_tables`

Ensure that partial table creation failures or concurrency conflicts are handled (e.g., if two servers start at once).

713-713: Warning on skipping auth

Clear messaging for operators. Ensure the docs or environment variable references are up to date to prevent security oversights.
src/app.controller.post.test.spec.ts (11)
21-21
: Import of RolePermission
.
Neatly integrated for verifying role-based test scenarios.
36-40
: New variables customers
array and userId
.
- Consolidating multiple customers in one array simplifies test cleanup.
userId
ensures realistic scenario coverage for user-customer relationships.
76-77
: Retrieve user ID from token.
This extra step is helpful for validating user-scoped data. Keep an eye on performance if repeated frequently in large test suites.
78-90
: User creation test logic.
- Mocks creation of a user, verifying that the result has hashed credentials.
- Good approach to ensure newly created users can be used in subsequent permission checks.
103-111
: Single-customer creation test.
Correctly validates that primary key and key fields are present. Uses customers.push(result.body)
for later cleanup.
Line range hint 116-133
: Bulk-customer creation test.
- Efficiently tests array-based creation with multiple records.
- The approach of verifying
total
,errored
, andsuccessful
is a good practice for multi-record endpoints.
137-186
: Public creation tests with different permission levels.
- Thoroughly tests READ permissions blocking creation and WRITE permissions allowing it.
- The teardown with
deletePublicTablesRecord
ensures a clean environment for subsequent tests.
189-322
: Role-Based creation tests.
- Exhaustive coverage for various role combinations, including ownRecords logic.
- Great example of negative tests ensuring a 401 is returned for disallowed attempts.
347-386
: Role-based creation with multiple records.
- Checking partial failures is essential for robust multi-insert logic.
- Confirms that successful records are returned while disallowed requests are blocked, a strong step for real-world usage.
461-482: Allowed fields coverage (single record creation).
- Thorough approach ensures fields not listed in `allowed_fields` are omitted.
- This is critical for data privacy in multi-tenant or restricted data use cases.
522-572: Allowed fields coverage (multiple record creation).
- Confirms correct filtering for all records in a bulk request.
- Another solid demonstration of partial field return logic.
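The allowed-fields masking these tests exercise can be sketched as a small pure function. This is an illustrative sketch, not the project's actual implementation; the function name `filterAllowedFields` and the sample field names are hypothetical.

```typescript
// Hypothetical sketch of allowed-fields masking: return only the fields a
// role's allowed_fields list permits; an empty list means "no restriction".
function filterAllowedFields(
	record: Record<string, unknown>,
	allowed: string[],
): Record<string, unknown> {
	if (!allowed.length) return { ...record }
	const filtered: Record<string, unknown> = {}
	for (const field of allowed) {
		if (field in record) {
			filtered[field] = record[field]
		}
	}
	return filtered
}

const customer = { custId: 1, companyName: 'Acme', contactName: 'Jo', userId: 1 }
filterAllowedFields(customer, ['custId', 'companyName'])
// → { custId: 1, companyName: 'Acme' }
```

Skipping fields that are absent from the record (rather than emitting `undefined` keys) keeps bulk responses uniform when some records lack optional columns.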
src/app.controller.put.test.spec.ts (8)
25-25: Import of `RolePermission`.
Methodical approach to unify testing of role logic.
46-53: `customers` array and `userId` variable.
- Centralizing references to multiple customers helps maintain clarity.
- Ensuring coverage of multi-user scenarios fosters test reliability.
108-115: Initial login, user creation, and multiple customers creation.
- Thorough setup to test updates on various user/customer combos.
- The approach aligns with the newly introduced userId references throughout the PR.
Line range hint 134-148: Single record update test.
- Validates correct property modifications and ensures the record ID remains stable.
- Correctly updates the local `customers` array after the request.
Line range hint 151-187: Bulk update test.
- Checking `total`, `errored`, and `successful` ensures robust multi-record update logic.
- Good strategy to persist updated records back into `customers`.
223-273: Public updating tests with different permission levels.
- Maintains consistency with the public table read/write approach from the POST tests.
- Solid negative test coverage (401 expected when permission is lacking).
276-434: Role-based updates for single & multiple records.
- Systematically checks the effect of `records` vs. `own_records` permissions.
- Use of 401 for forbidden attempts is consistent with the rest of the codebase.
489-612: Allowed fields coverage for PUT requests.
- Verifies that only specified fields are returned, matching the approach in POST tests.
- Ensures updates succeed while data is selectively hidden in responses.
src/app.controller.get.test.spec.ts (27)
24-25: New imports for `RolePermission` and `UserTestingService`
These imports provide clarity for role-based permissions and user-related testing, and they look properly introduced here.
37-37: Added variable: `userTestingService`
Declaring `userTestingService` expands the test setup to create, fetch, and delete users. This is consistent with the new role-based tests.
45-45: Added variable: `userSchema`
Storing the schema in `userSchema` is a clean approach for referencing dynamic properties (like the primary key).
81-89: Providers and Exports
Registering `UserTestingService` in both providers and exports ensures it's easily accessible in other modules. Good addition.
100-100: Retrieving `UserTestingService`
Acquiring `userTestingService` from the Nest container is consistent with the DI pattern.
106-106: Fetching user schema
Getting the schema for the user table is well-aligned with the approach seen for other entities.
108-109: Obtain JWT and userId
Using `authTestingService.login()` and `getUserId(...)` for test setup is consistent and keeps authentication logic centralized.
111-119: Mock user creation flow
Mocking the user, sending a POST request, and verifying a 201 response ensures the user creation process is tested end-to-end. This is thorough.
120-120: Associate created customer with userId
Passing `user[userSchema.primary_key]` into the new customer anchors the user-customer relationship. Solid approach for validating relational data.
309-310: Verifying response body
Ensuring the result body is defined is fundamental. Straightforward check.
315-315: Verifying string field
Checking `shipName` against `'string'` helps confirm correct data shape.
321-321: Verifying numeric field
Confirming `freight` is a number is a meaningful test for data integrity.
336-342: Public Fetch: Default public fail
This test ensures unauthenticated requests get a `401` response. Good coverage of security boundaries.
343-359: Public Fetch with READ permissions
This verifies that records can be fetched publicly with the correct access level. Excellent demonstration of role-based access.
361-377: Public Fetch with WRITE permissions
Further checks ensure that a READ-like operation is allowed for a table flagged with WRITE. This clarifies partial overlaps between roles.
380-386: Role-based fetching: No table role
Verifies default behavior when no explicit role is assigned. Confirms that the route fallback logic is correct.
388-409: Role-based fetching: DELETE table role
Shows that having a DELETE permission also implicitly grants read access to that record. Clarifies the role model.
411-432: Role-based fetching: DELETE own records
Tests an instance of “own records only.” The `204` outcome for someone else's record is interesting; presumably, it's the chosen way to handle “not allowed or not found.” Ensure this matches the intended UX. If partial data or “Forbidden” is expected, adjust accordingly.
434-455: Role-based fetching: WRITE table role
Confirms that the record can be retrieved if the role has WRITE access. Shows the interplay between write vs. read.
457-478: Role-based fetching: WRITE own records only
Similar approach but for “own records.” Ensures partial role coverage.
480-501: Role-based fetching: READ table role
Tests if users with READ permissions can retrieve the record. Straightforward scenario.
503-524: Role-based fetching: READ own records only
Again, tests that other users’ data is not accessible. Good coverage.
527-583: Allowed Fields Results: Full fields
Ensures that the default scenario returns all fields. Great baseline test.
585-622: Allowed Fields: Restricting returned fields
Only the specified fields are returned. Validates field-level permissions.
624-649: Allowed Fields with relations
Checks that both primary object fields and related entity fields remain consistent with the role-based mask.
651-693: Allowed Fields with userId and user relation
Verifies that deeper nested fields are also handled properly. Good approach for advanced testing.
749-749: Deleting user after tests
Cleaning up the test data is best practice. Ensures no leftover records.
src/app.constants.ts (1)
4-4: Renamed constant to `LLANA_PUBLIC_TABLES`
Switching from `LLANA_AUTH_TABLE` to `LLANA_PUBLIC_TABLES` clarifies public vs. private access. Consistent with the updated naming scheme across the codebase.
src/types/auth.types.ts (1)
58-58: Extension of `AuthTablePermissionSuccessResponse`
Adding `allowed_fields?: string[]` broadens the shape of the success response for more granular authorization control. Well-integrated with the new role-based scenario.
src/modules/websocket/websocket.service.ts (1)
20-23: Preventing publish calls with missing id
This is a sensible check that prevents publishing invalid events. Proper use of early return to avoid overhead.
src/testing/customer.testing.service.ts (1)
46-46: Passing `customer.userId` into `mockCustomer`
Ensures the user-customer link is embedded in newly created mock entries. Maintains consistency with the rest of the PR’s refactoring for user-linked data.
src/helpers/Roles.ts (3)
11-11: Clarify Environment Check Import
It’s good to see you leveraging `Env` to selectively skip cache retrieval during tests. Make sure that `Env.IsNotTest()` usage is tested so that code paths remain covered in both testing and production modes.
286-296: Helpful Utility Method
`formatAllowedFields` is a neat helper for consistently splitting fields. Looks clean and aids readability, especially since it trims spaces.
304-324: Permission Comparison Function Looks Solid
`comparePermissions` logic is straightforward and well-structured. Consider adding unit tests for each enumerated permission to ensure coverage.
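Based on the behavior described above, the two helpers might look roughly like this. This is a sketch, not the project's source: the permission ordering (NONE < READ < WRITE < DELETE) is an assumption inferred from the fetch tests, and the enum values may differ in the real codebase.

```typescript
// Assumed permission enum and escalation order; the real project may
// name or order these differently.
enum RolePermission {
	NONE = 'NONE',
	READ = 'READ',
	WRITE = 'WRITE',
	DELETE = 'DELETE',
}

const PERMISSION_ORDER = [
	RolePermission.NONE,
	RolePermission.READ,
	RolePermission.WRITE,
	RolePermission.DELETE,
]

// Does the granted permission cover the required access level?
// e.g. DELETE implicitly grants READ, matching the fetch tests above.
function comparePermissions(granted: RolePermission, required: RolePermission): boolean {
	return PERMISSION_ORDER.indexOf(granted) >= PERMISSION_ORDER.indexOf(required)
}

// Split a comma-separated allowed_fields string into trimmed field names.
function formatAllowedFields(allowedFields: string): string[] {
	return allowedFields
		.split(',')
		.map(field => field.trim())
		.filter(field => field.length > 0)
}
```

Unit tests for this shape are cheap: one assertion per (granted, required) pair covers the whole permission matrix.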
src/helpers/Schema.ts (1)
691-695: Clarify “Deep Relation” Error
Using “Deep Relation” helps differentiate from “Relation field.” Good for clarity. Ensure that your documentation references these distinctions so devs know which error to expect.
demo/databases/airtable.ts (1)
133-137: New `userId` Field Enhances User-Customer Relationship
Adding `userId` ensures consistent relationships across your data model. Make sure to handle the case in which a record might not have a user associated, if that’s allowed.
demo/databases/mongodb.js (7)
27-31: Establishing cross-table relationship.
These lines introduce a link between the `Customer` table's `userId` column and the `User` table's `_id` column. This effectively models a one-to-one or many-to-one relationship (depending on usage) to associate customers with users. Ensure related application logic (queries, indexes) is in place to handle this new foreign key linkage.
Line range hint 98-112: Consistent assignment of `userId` to Customer.
Including `userId` in customer documents enforces the newly introduced relationship. Double-check that `user._id` is defined and updated properly at this point to avoid referencing a non-existent or stale `_id`.
Line range hint 113-127: Repeated `userId` addition in subsequent documents.
Duplicating the `userId: user._id` assignment for each inserted record is consistent for all documents. This ensures uniform usage of the new association. No issues found; keep it consistent with the rest of the code.
Line range hint 128-142: User association for third customer block.
Same pattern as before, looks stable. Confirm that future logic (e.g., retrieving and updating these customers) accounts for the user relationship.
Line range hint 143-157: Check references for possible concurrency conflicts.
If multiple users or processes can seed or update data concurrently, consider whether collisions on `userId` assignments might occur. Here it appears safe, but worth verifying concurrency logic elsewhere.
Line range hint 218-232: Maintain consistent naming conventions.
Check for uniform naming of fields across the entire codebase: `custId` is sometimes referred to as `_id` in other relationships. Ensure naming is consistent to avoid confusion.
Line range hint 233-247: Validate edge cases with multiple countries/languages.
The seed data includes multiple international addresses and phone formats. Just be mindful if future logic depends on uniform input formatting.
src/app.service.auth.ts (1)
23-27: New method for user identification from JWT.
The `getUserId(jwt: string)` method provides a concise way to extract the `sub` property from the token. This is appropriate for user identification. Ensure it’s used consistently to avoid logic drift between different parts of the code that decode JWTs.
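For illustration, extracting the `sub` claim from a JWT payload can be sketched as below. This is a hypothetical helper (the real service presumably decodes via its JWT library, which also verifies the signature); it only demonstrates the payload structure.

```typescript
// Hypothetical sketch: decode a JWT's payload segment and read the `sub`
// claim. No signature verification happens here; a real service must
// verify the token with its JWT library before trusting any claims.
function getUserIdFromJwt(jwt: string): string | undefined {
	const segments = jwt.split('.')
	if (segments.length !== 3) return undefined // not header.payload.signature
	const payloadJson = Buffer.from(segments[1], 'base64url').toString('utf8')
	const payload = JSON.parse(payloadJson) as { sub?: string }
	return payload.sub
}
```

Centralizing this in one method, as the PR does, avoids the drift the review warns about: every caller gets the same decoding and error behavior.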
demo/databases/json/Customer.json (7)
2-2: `userId` field addition for Customer #1.
The newly introduced `userId: 1` ties the customer record to a user. Ensure that a user with ID 1 actually exists if a relational check or foreign key constraint is enforced in the environment.
17-17: Repeated user association for Customer #2.
Same note on verifying that the `userId` value remains consistent with your actual users. No immediate concerns, but keep references aligned.
32-32: Customer #3 references userId = 1.
No immediate issues. Confirm that ID collisions don’t exist if more than one user uses the value `1`.
62-62: Retaining the original structure while adding a user link.
No issues found. The rest of the fields remain unchanged. Good approach to preserving existing data while extending functionality.
77-77: Additional user reference for Customer #6.
Everything continues to follow the same pattern. Confirm that any logic to retrieve user details expects numeric IDs if you plan to have non-numeric userIds in the future.
107-107: Ideally ensure region format consistency.
Though unrelated to `userId`, note the presence of `region` fields. Some have a region as `null`, others as a string (e.g., "BC"), which may cause confusion if a standard format is required.
122-122: Ensuring correct references for international addresses.
No direct user association problem. Just a reminder that `userId` is uniformly set to 1. Keep data mapping in mind for future expansions to multiple users or roles.
src/app.controller.post.ts (5)
Line range hint 1-15: Added imports and typed definitions.
The new imports (`Body`, `AuthTablePermissionFailResponse`, and others) support the refactoring to incorporate role-based checks and typed request body handling. Ensure that all newly imported types are indeed utilized; otherwise, consider removing them to keep the code lean.
87-117: Role-based checks for array inserts.
The introduction of loop-based permission checks (lines 87+) is good for validating each record in multi-insert scenarios. Ensure the loop handles large arrays efficiently and fails fast when possible if that’s desirable.
Having a running tally of `successful`/`errored` is a user-friendly approach, returning partial successes and robust error details.
🧰 Tools
🪛 GitHub Check: CodeQL
[failure] 100-100: Type confusion through parameter tampering
Potential type confusion as this HTTP request parameter may be either an array or a string.
148-173: Role-based checks for single-record inserts.
The structure mirrors the multi-insert logic, ensuring consistency. Good approach. Also note that if the table is eventually made public, the `auth.user_identifier` path remains bypassed. This might cause confusion for certain user flows. Confirm that logic is consistent with your overall security model.
185-188: New `createOneRecord` method signature updates.
Passing `fields: string[]` as a separate parameter is a good approach to controlling which fields are allowed. Double-check that sensitive fields are excluded effectively. The narrower scope helps maintain separation of concerns between user-level permission checks and data insertion logic.
196-196: Validating input data using schema rules.
Ensuring data validity before creation is a cornerstone of secure and robust code. Good use of `schema.validateData`. Confirm it covers edge cases such as unexpected data types or missing required fields.
demo/databases/mysql.sql (1)
50-50: LGTM! Appropriate foreign key constraints
The foreign key constraint uses `ON UPDATE RESTRICT`, which is a safer choice for maintaining referential integrity.
Also applies to: 67-67
demo/databases/postgres.sql (1)
47-47: Verify impact of email nullability change
The `email` column in both Customer and Employee tables has been changed from `NOT NULL` to `NULL`. This change in constraint could affect existing applications that assume email is always present.
Run this script to check for potential impact:
Also applies to: 68-68
✅ Verification successful
Let me gather more information about the email field usage in the schema and validation layers.
Let me check how the email field is used in the application code.
Based on the analysis of the code and database schema, I can now provide the final response:
Email nullability change appears safe to implement
The change to make email nullable in Customer and Employee tables appears safe because:
- No validation constraints or business logic require email to be present:
  - No `@IsEmail` or `@IsNotEmpty` decorators found for email fields
  - No explicit email validation checks in application code
  - Email is only used in test data generation with faker and basic field existence checks in tests
- The application code treats email as an optional field:
  - Test assertions only check if email is "defined" when present
  - No code paths depend on email being non-null
  - Email is used primarily for display/reference purposes
- The User table (which handles authentication) maintains its own separate email field with NOT NULL constraint, so this change doesn't affect core authentication functionality.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check for direct email field usage in the codebase
echo "Searching for email field usage in code..."
rg -A 3 "\.email" --type ts --type js
echo "Searching for email validation or requirement checks..."
rg -A 3 "email.*required|required.*email" --type ts --type js
Length of output: 3956
Script:
#!/bin/bash
# Check for email-related validations in schema files
echo "Searching for email in schema definitions..."
fd -e ts -e js -e json --full-path "schema" -x cat {}
# Check for email in validation files
echo "Searching for email in validation files..."
fd -e ts -e js --full-path "valid" -x cat {}
# Check for email in database migration files
echo "Searching for email in SQL files..."
fd -e sql -x cat {}
Length of output: 122564
Script:
#!/bin/bash
# Search for email validation or requirement checks in the application code
echo "Searching for email validation patterns..."
ast-grep --pattern 'email.*required|required.*email|@IsEmail|@IsNotEmpty.*email|email.*@IsNotEmpty'
# Search for email usage in database operations
echo "Searching for email in database operations..."
rg -A 3 "\.email|email.*=" --type ts --type js
Length of output: 3983
async deletePublicTablesRecord(data: any): Promise<void> {
	const schema = await this.schema.getSchema({ table: LLANA_PUBLIC_TABLES, x_request_id: 'test' })
	await this.query.perform(QueryPerform.DELETE, {
		id: data[schema.primary_key],
		schema,
	})
}
🛠️ Refactor suggestion
Add error handling for schema retrieval failure
The method should handle potential errors when retrieving the schema.
async deletePublicTablesRecord(data: any): Promise<void> {
+ try {
const schema = await this.schema.getSchema({ table: LLANA_PUBLIC_TABLES, x_request_id: 'test' })
+ if (!schema) {
+ throw new Error('Failed to retrieve schema for public tables')
+ }
await this.query.perform(QueryPerform.DELETE, {
id: data[schema.primary_key],
schema,
})
+ } catch (error) {
+ throw new Error(`Failed to delete public tables record: ${error.message}`)
+ }
}
async deleteRole(data: any): Promise<void> {
	const schema = await this.schema.getSchema({ table: LLANA_ROLES_TABLE, x_request_id: 'test' })
	await this.query.perform(QueryPerform.DELETE, {
		id: data[schema.primary_key],
		schema,
	})
}
🛠️ Refactor suggestion
Add error handling for deleteRole method
The method should handle potential errors and validate input.
async deleteRole(data: any): Promise<void> {
+ if (!data) {
+ throw new Error('Role data is required')
+ }
+ try {
const schema = await this.schema.getSchema({ table: LLANA_ROLES_TABLE, x_request_id: 'test' })
+ if (!schema) {
+ throw new Error('Failed to retrieve schema for roles table')
+ }
await this.query.perform(QueryPerform.DELETE, {
id: data[schema.primary_key],
schema,
})
+ } catch (error) {
+ throw new Error(`Failed to delete role: ${error.message}`)
+ }
}
async createRole(data: {
	custom: boolean
	table: string
	identity_column?: string
	role: string
	records: RolePermission
	own_records: RolePermission
	allowed_fields?: string
}): Promise<FindOneResponseObject> {
	const schema = await this.schema.getSchema({ table: LLANA_ROLES_TABLE, x_request_id: 'test' })
	return (await this.query.perform(QueryPerform.CREATE, <DataSourceCreateOneOptions>{
		schema,
		data,
	})) as FindOneResponseObject
}
🛠️ Refactor suggestion
Enhance type safety and validation for createRole method
The method accepts several optional parameters without proper validation.
async createRole(data: {
custom: boolean
table: string
identity_column?: string
role: string
records: RolePermission
own_records: RolePermission
allowed_fields?: string
}): Promise<FindOneResponseObject> {
+ // Validate required fields
+ if (!data.table?.trim()) {
+ throw new Error('Table name is required')
+ }
+ if (!data.role?.trim()) {
+ throw new Error('Role name is required')
+ }
+ if (!Object.values(RolePermission).includes(data.records)) {
+ throw new Error('Invalid records permission')
+ }
+ if (!Object.values(RolePermission).includes(data.own_records)) {
+ throw new Error('Invalid own_records permission')
+ }
const schema = await this.schema.getSchema({ table: LLANA_ROLES_TABLE, x_request_id: 'test' })
return (await this.query.perform(QueryPerform.CREATE, <DataSourceCreateOneOptions>{
schema,
data,
})) as FindOneResponseObject
}
if (fields.length) {
	const filtered = {}
	for (const field of fields) {
		filtered[field] = result[field]
	}
	return res.status(200).send(filtered)
}
|
🛠️ Refactor suggestion
Add field filtering validation
Ensure that the filtered fields are actually present in the schema.
if (fields.length) {
const filtered = {}
+ const schemaFields = new Set(schema.columns.map(col => col.field))
for (const field of fields) {
+ if (!schemaFields.has(field)) {
+ this.logger.warn(`Filtered field ${field} not found in schema`)
+ continue
+ }
filtered[field] = result[field]
}
return res.status(200).send(filtered)
}
Committable suggestion skipped: line range outside the PR's diff.
skipAuth(): boolean {
	const skipAuth = this.configService.get('SKIP_AUTH')
	// Only skip if explicitly set to 'true' string
	const shouldSkip = skipAuth === 'true'
	if (shouldSkip) {
		this.logger.debug(`[Authentication][auth] Skipping authentication due to SKIP_AUTH being true`)
	}
	return shouldSkip
}
🛠️ Refactor suggestion
Enhance skipAuth security
The current implementation of skipAuth could be more secure.
skipAuth(): boolean {
const skipAuth = this.configService.get('SKIP_AUTH')
- // Only skip if explicitly set to 'true' string
- const shouldSkip = skipAuth === 'true'
+ // Only skip if explicitly set to 'true' string and not in production
+ const isProduction = this.configService.get('NODE_ENV') === 'production'
+ const shouldSkip = !isProduction && skipAuth === 'true'
if (shouldSkip) {
- this.logger.debug(`[Authentication][auth] Skipping authentication due to SKIP_AUTH being true`)
+ this.logger.warn(`[Authentication][auth] Skipping authentication in non-production environment`)
}
return shouldSkip
}
for (let customer of customers) {
	await customerTestingService.deleteCustomer(customer[customerSchema.primary_key])
}
🛠️ Refactor suggestion
Add error handling for cleanup in afterAll
The cleanup process should handle potential errors gracefully.
afterAll(async () => {
+ const cleanupErrors = []
for (let customer of customers) {
- await customerTestingService.deleteCustomer(customer[customerSchema.primary_key])
+ try {
+ await customerTestingService.deleteCustomer(customer[customerSchema.primary_key])
+ } catch (error) {
+ cleanupErrors.push(`Failed to delete customer ${customer[customerSchema.primary_key]}: ${error.message}`)
+ }
}
+ if (cleanupErrors.length > 0) {
+ console.error('Cleanup errors:', cleanupErrors)
+ }
await app.close()
})
if (options.fields?.length && !options.fields.includes(options.schema.primary_key)) {
	delete result[0][options.schema.primary_key]
}
|
💡 Codebase verification
🛠️ Refactor suggestion
Protect Against Missing Results - Confirmed Issue
Looking at the code context, I can see that `result[0]` is accessed multiple times after the MongoDB query, not just for deleting the primary key but also for logging and in `formatOutput`. The current implementation doesn't check if the result array has any elements before accessing index 0, which could lead to runtime errors if the query returns no results.
- `src/datasources/mongo.datasource.ts`: Multiple unsafe accesses of `result[0]` without length check:
  delete result[0][options.schema.primary_key]
  this.logger.debug(`[${DATABASE_TYPE}] Result: ${JSON.stringify(result[0])}`, x_request_id)
  return this.formatOutput(options, result[0])
Notably, other datasources like Postgres have proper null checks (e.g., `if (!result[0]) { return }`), which makes this oversight in the MongoDB datasource more apparent.
🔗 Analysis chain
Protect Against Missing Results
When you delete the primary key from `result[0]`, ensure `result[0]` actually exists. If the query returns an empty array, this could throw a runtime error.
const result = await mongo.collection.find(mongoFilters).project(mongoFields).limit(1).toArray()
-if (options.fields?.length && !options.fields.includes(options.schema.primary_key)) {
- delete result[0][options.schema.primary_key]
-}
+if (result.length > 0 && options.fields?.length && !options.fields.includes(options.schema.primary_key)) {
+ delete result[0][options.schema.primary_key]
+}
Let me verify this by checking the surrounding context and error handling patterns in the codebase.
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for the function containing this code to understand the context
ast-grep --pattern 'async function $FUNC($$$) {
$$$
const result = await mongo.collection.find($$$).project($$$).limit(1).toArray()
$$$
}'
# Also search for similar patterns of array access to see error handling
rg "result\[0\]" -A 3 -B 3
Length of output: 4488
	@Body() body: Partial<any> | Partial<any>[],
): Promise<FindOneResponseObject | CreateManyResponseObject> {
	const x_request_id = headers['x-request-id']
	let table_name = UrlToTable(req.originalUrl, 1)

	if (table_name === 'webhook') {
		table_name = LLANA_WEBHOOK_TABLE

		//perform auth on webhook table
		const auth = await this.authentication.auth({
			table: req.body.table,
			x_request_id,
			access: RolePermission.READ,
			headers: req.headers,
			body: req.body,
			query: req.query,
		})
		if (!auth.valid) {
			return res.status(401).send(auth.message)
		}

		//perform role check
		if (auth.user_identifier) {
			const { valid, message } = (await this.roles.tablePermission({
				identifier: auth.user_identifier,
				table: req.body.table,
				access: RolePermission.READ,
				x_request_id,
			})) as AuthTablePermissionFailResponse

			if (!valid) {
				return res.status(401).send(this.response.text(message))
			}
		}

		if (!req.body.user_identifier) {
			req.body.user_identifier = auth.user_identifier
		}
	}

	const body = req.body

	const options: DataSourceCreateOneOptions = {
		schema: null,
		data: {},
	}
	let schema: DataSourceSchema
	let fields = []

	try {
		options.schema = await this.schema.getSchema({ table: table_name, x_request_id })
		schema = await this.schema.getSchema({ table: table_name, x_request_id })
	} catch (e) {
		return res.status(404).send(this.response.text(e.message))
	}

	const auth = await this.authentication.auth({
	// Is the table public?
	let auth = await this.authentication.public({
		table: table_name,
		access_level: RolePermission.WRITE,
		x_request_id,
		access: RolePermission.WRITE,
		headers: req.headers,
		body: req.body,
		query: req.query,
	})
	if (!auth.valid) {
		return res.status(401).send(auth.message)
	}

	//perform role check
	if (auth.user_identifier) {
		const { valid, message } = (await this.roles.tablePermission({
			identifier: auth.user_identifier,
	// If not public, perform auth
	if (!auth.valid) {
		auth = await this.authentication.auth({
			table: table_name,
			access: RolePermission.WRITE,
			x_request_id,
		})) as AuthTablePermissionFailResponse

		if (!valid) {
			return res.status(401).send(this.response.text(message))
			access: RolePermission.WRITE,
			headers: req.headers,
			body: req.body,
			query: req.query,
		})
		if (!auth.valid) {
			return res.status(401).send(this.response.text(auth.message))
Refactoring to handle @Body() and public vs. private table logic.

- The addition of @Body() allows simpler request body extraction, but watch for potential type confusion if an attacker sends arrays vs. objects.
- Good logic flow: first check if the table is public, then proceed with authentication if needed.
- Consider input validation for body to avoid unexpected shapes or malicious data.
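The public-first flow described above can be made explicit with a small helper. In this sketch, `authorize`, `publicCheck`, and `authCheck` are hypothetical stand-ins for the controller's calls to `authentication.public()` and `authentication.auth()`:

```typescript
// Hypothetical two-step authorization helper: public tables short-circuit,
// everything else falls through to full authentication.
interface AuthResult {
	valid: boolean
	message?: string
	user_identifier?: string
}

async function authorize(
	publicCheck: () => Promise<AuthResult>,
	authCheck: () => Promise<AuthResult>,
): Promise<AuthResult> {
	// Step 1: is the table public? If so, no credentials are required.
	const pub = await publicCheck()
	if (pub.valid) {
		return pub
	}
	// Step 2: otherwise require authentication.
	return authCheck()
}
```

Centralizing the two-step check like this keeps each endpoint from re-implementing the public/private branching and makes the fallback order testable in isolation.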
Here is how you could guard the parameter against type confusion:
- @Body() body: Partial<any> | Partial<any>[],
+ @Body() body: Partial<Record<string, any>> | Partial<Record<string, any>>[],
This ensures a more explicit type. Evaluate usage to confirm you’re handling arrays correctly.
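Beyond the type annotation, a runtime guard can close the array-vs-object confusion entirely. The `normalizeBody` helper below is illustrative only (not part of the PR): it rejects primitives and normalizes a single object to a one-element array so downstream code handles exactly one shape:

```typescript
type BodyRecord = Partial<Record<string, any>>

// Illustrative guard: reject primitives, and normalize a single object to a
// one-element array so downstream code handles exactly one shape.
function normalizeBody(body: unknown): BodyRecord[] {
	const isPlainObject = (v: unknown): v is BodyRecord =>
		typeof v === 'object' && v !== null && !Array.isArray(v)
	if (Array.isArray(body)) {
		if (!body.every(isPlainObject)) {
			throw new Error('Array body must contain plain objects')
		}
		return body
	}
	if (isPlainObject(body)) {
		return [body]
	}
	throw new Error('Body must be an object or an array of objects')
}
```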
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
	@Body() body: Partial<Record<string, any>> | Partial<Record<string, any>>[],
): Promise<FindOneResponseObject | CreateManyResponseObject> {
	const x_request_id = headers['x-request-id']
	let table_name = UrlToTable(req.originalUrl, 1)
	if (table_name === 'webhook') {
		table_name = LLANA_WEBHOOK_TABLE
	}
	let schema: DataSourceSchema
	let fields = []
	try {
		schema = await this.schema.getSchema({ table: table_name, x_request_id })
	} catch (e) {
		return res.status(404).send(this.response.text(e.message))
	}
	// Is the table public?
	let auth = await this.authentication.public({
		table: table_name,
		access_level: RolePermission.WRITE,
		x_request_id,
	})
	// If not public, perform auth
	if (!auth.valid) {
		auth = await this.authentication.auth({
			table: table_name,
			x_request_id,
			access: RolePermission.WRITE,
			headers: req.headers,
			body: req.body,
			query: req.query,
		})
		if (!auth.valid) {
			return res.status(401).send(this.response.text(auth.message))
@@ -46,6 +46,7 @@ SET IDENTITY_INSERT UserApiKey OFF;

CREATE TABLE Customer (
	custId INT IDENTITY NOT NULL
	,userId int NOT NULL
🛠️ Refactor suggestion
Review CASCADE DELETE behavior on Customer-User relationship
The foreign key constraint uses ON DELETE CASCADE, which automatically deletes all customer records when their associated user is deleted. This could lead to unintended data loss, especially in scenarios where user accounts are temporarily deactivated.
Consider using ON DELETE NO ACTION (SQL Server's equivalent of RESTRICT, which it does not support by that keyword) or implementing a soft delete pattern instead:
- CONSTRAINT CustomerUserId FOREIGN KEY (userId) REFERENCES [User] (id) ON DELETE CASCADE ON UPDATE NO ACTION
+ CONSTRAINT CustomerUserId FOREIGN KEY (userId) REFERENCES [User] (id) ON DELETE NO ACTION ON UPDATE NO ACTION
Also applies to: 66-66
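If the soft-delete route is taken, deactivation becomes an update rather than a DELETE, so dependent Customer rows survive and no cascade ever fires. A minimal in-memory sketch, assuming a hypothetical `deletedAt` column on the User table:

```typescript
// Hypothetical soft-delete sketch: "deleting" a user stamps deletedAt
// instead of removing the row, so rows referencing it are untouched.
interface UserRow {
	id: number
	deletedAt: Date | null
}

function softDeleteUser(users: UserRow[], id: number): void {
	const user = users.find(u => u.id === id)
	if (user) {
		user.deletedAt = new Date()
	}
}

// Read paths filter on deletedAt IS NULL instead of relying on row removal.
function activeUsers(users: UserRow[]): UserRow[] {
	return users.filter(u => u.deletedAt === null)
}
```

The trade-off is that every read path must apply the `deletedAt` filter, but reactivating an account is then a one-column update rather than a restore.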
Summary by CodeRabbit

New Features

- Added a userId field to the Customer table across various database implementations.
- Added a Supplier table in the PostgreSQL schema.
- Enhanced AuthService with a method to extract user ID from JWT.
- Introduced AuthTestingService.

Bug Fixes

Documentation

Tests

Refactor

- Updated the Schema and Query classes.