Feature/add firo network #584
Conversation
Walkthrough
Adds Firo network support and Spark (Zcoin) integration across network configs, Electrum client, Firo wallet layers, Spark WASM tooling and handlers, send UI tabs + verification flows, background/keyring wiring, wasm loaders, anonymity/tag sync utilities, types, and tests.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant UI as User UI (SparkSendTab)
    participant Verify as Verify View
    participant Handler as Spark Handler (JS)
    participant WASM as Spark WASM
    participant Wallet as PublicFiroWallet
    participant Electrum as FiroElectrum
    UI->>Verify: user submits recipient+amount
    Verify->>Handler: sendFromSparkAddress(network,to,amount)
    Handler->>WASM: init & ccall helpers (deserialize, mint/recipients)
    Handler->>Wallet: getOnlySpendableUtxos()
    Wallet->>Electrum: fetch UTXOs / anonymity data
    Electrum-->>Wallet: UTXO / meta results
    Wallet-->>Handler: spendable UTXOs
    Handler->>WASM: createSparkSpend(js_createSparkSpend)
    WASM-->>Handler: serialized spend & outputs
    Handler->>Verify: return assembled tx hex
    Verify->>Electrum: broadcast(txHex)
    Electrum-->>Verify: txid / result
    Verify->>UI: display success or error
```
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes
Possibly related PRs
Suggested reviewers
Pre-merge checks and finishing touches
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Force-pushed from 2a1b0bc to a79031b
Actionable comments posted: 8
🧹 Nitpick comments (36)
packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue (3)
11-31: Use a consistent fallback strategy when Spark account is unavailable.
This newly introduced tab-based interface is conditionally rendered for Firo and Spark. The logic is straightforward, but consider providing a clear fallback mechanism when Spark is unavailable or the user rapidly toggles between tabs. This can help avoid edge cases and potential confusion for users when they do not have a Spark account set up.
78-78: Typed approach for selectedSendTab is clean.
Defining "transparent" | "spark" in a ref is succinct and ensures type safety. Consider documenting the default "transparent" state if you foresee adding more tabs down the line.
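A minimal sketch of that pattern with the default made explicit (the SendTab alias is illustrative, not from the PR):

```typescript
import { ref } from "vue";

type SendTab = "transparent" | "spark";

// "transparent" is the default tab; widening the union later
// (e.g. adding more tabs) keeps every assignment type-checked.
const selectedSendTab = ref<SendTab>("transparent");
```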
115-153: Styling is cohesive and ensures good UX.
The added CSS for .send-transaction__tabs and .send-transaction__tabs-active is consistent with the project’s style. If the design evolves to include additional states, consider using BEM or a theming approach for easier maintainability.
packages/extension/src/providers/bitcoin/libs/api-firo.ts (3)
27-29: Address derivation looks correct.
getBitcoinAddress is used correctly. Consider optional chaining if there's any chance pubkey conversions might fail, but it seems safe here.
31-32: init stub left empty.
You might consider adding logs or clarifying comments if future initialization steps are planned.
74-87: Balance computation is straightforward.
Adding confirmed and unconfirmed satoshi values is correct. Catching network fetch errors by returning "0" is safe, though consider logging partial errors for debugging.
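A hedged sketch of the suggested logging, assuming an Insight-style endpoint with balanceSat/unconfirmedBalanceSat fields (both endpoint path and field names are assumptions here, not confirmed from the PR):

```typescript
const getBalance = async (nodeUrl: string, address: string): Promise<string> => {
  try {
    const res = await fetch(`${nodeUrl}/insight-api-zcoin/addr/${address}`).then(
      (r) => r.json(),
    );
    // Sum confirmed and unconfirmed satoshis, as the handler does today.
    const sats = (res.balanceSat ?? 0) + (res.unconfirmedBalanceSat ?? 0);
    return sats.toString();
  } catch (err) {
    // Keep the safe "0" fallback but surface the failure for debugging.
    console.error("api-firo: balance fetch failed", err);
    return "0";
  }
};
```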
packages/extension/src/providers/bitcoin/networks/index.ts (1)
11-12: Export object consistency.
Listing firoTest and firo among existing networks is logical. Double-check naming for uniformity, as you import firoTestnet but export it as firoTest.
packages/extension/src/libs/spark-handler/callRPC.ts (1)
3-3: Avoid hardcoding RPC endpoint.
Storing the RPC URL within the source code can be problematic for future environment changes or sensitive endpoint usage. Consider externalizing this URL (e.g., environment variable or config file) for better flexibility and maintainability.
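A minimal sketch of that externalization; SPARK_RPC_URL is a hypothetical config key, and the fallback is the endpoint already referenced elsewhere in this review:

```typescript
// Hypothetical: read the endpoint from build-time config instead of a literal.
const SPARK_RPC_ENDPOINT: string =
  process.env.SPARK_RPC_URL ?? "https://firo-rpc.publicnode.com/";

export default SPARK_RPC_ENDPOINT;
```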
packages/extension/src/libs/spark-handler/index.ts (2)
22-28: Add targeted error handling.
The function calls callRPC and directly returns its promise. If the RPC fails or returns an invalid result, the caller won't perform specific handling. Consider adding logs or structured error responses for easier debugging; the sketch after the next comment folds this in.
30-43: Consider centralizing transaction logic.
Both sendToSparkAddress and sendFromSparkAddress share a similar pattern of calling RPC methods with a transaction object. If additional transaction flows arise or more robust orchestration is needed, centralizing shared logic or error handling in one place may improve maintainability. A sketch follows.
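One possible shape, which also covers the error-handling note above; the callRPC import path and signature, the SparkTxRequest fields, and the RPC method names are all assumptions:

```typescript
import { callRPC } from "./callRPC";

interface SparkTxRequest {
  network: string;
  to: string;
  amount: string;
}

// Single funnel for Spark transaction RPCs: uniform logging and error shape.
async function callSparkMethod<T>(method: string, tx: SparkTxRequest): Promise<T> {
  try {
    return (await callRPC(method, [tx])) as T;
  } catch (err) {
    console.error(`spark-handler: ${method} failed`, err);
    throw err;
  }
}

export const sendToSparkAddress = (tx: SparkTxRequest) =>
  callSparkMethod<string>("sendtosparkaddress", tx);

export const sendFromSparkAddress = (tx: SparkTxRequest) =>
  callSparkMethod<string>("sendfromsparkaddress", tx);
```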
packages/extension/src/providers/bitcoin/networks/firo.ts (1)
3-6: Confirm naming consistency.
The file references “BitcoinNetwork” and “BitcoinNetworkOptions” even though they represent a Firo configuration. This is likely a shared interface for all Bitcoin-like networks, but ensure the naming remains clear to future maintainers.
packages/extension/src/providers/bitcoin/networks/firo-testnet.ts (1)
3-6: Explicitly differentiate from mainnet.
While it is clear from the variable names and URLs that this is a testnet configuration, consider documenting any differences in chain parameters compared to mainnet for clarity (e.g., purpose, usage disclaimers, etc.).
packages/extension/src/ui/action/components/accounts-header/index.vue (1)
31-31: Confirm event handling in deposit component.
This second emission of "action:generate-new-spark" is consistent, but ensure that deposit properly handles all scenarios if the spark address is regenerated multiple times.
packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts (2)
20-23: Avoid assignment within an expression.
The assignment to "tokenPrice" within your .then() chain may trigger linter or readability warnings (as highlighted by static analysis). It's clearer to separate this assignment from the return expression.
Proposed snippet:

```diff
- .then((mdata) => (tokenPrice = mdata || "0"))
+ .then((mdata) => {
+   tokenPrice = mdata || "0";
+ })
```

🧰 Tools
🪛 Biome (1.9.4)
[error] 23-23: The assignment should not be in an expression.
The use of assignments in expressions is confusing.
Expressions are often considered as side-effect free. (lint/suspicious/noAssignInExpressions)
15-36: Consider using async/await syntax for readability.
Currently, fetch and subsequent transformations rely on nested .then() calls. Although functionally correct, switching to async/await often achieves better readability and easier error handling; a sketch follows the lint notes below.
🧰 Tools
🪛 Biome (1.9.4)
[error] 23-23: The assignment should not be in an expression.
The use of assignments in expressions is confusing.
Expressions are often considered as side-effect free. (lint/suspicious/noAssignInExpressions)
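A hedged async/await sketch of the same flow; getMarketData stands in for whatever the handler actually calls and is an assumption:

```typescript
// Assumed helper returning a price string or undefined:
declare function getMarketData(): Promise<string | undefined>;

async function resolveTokenPrice(): Promise<string> {
  try {
    const mdata = await getMarketData();
    // Plain statement instead of an assignment inside an expression,
    // which also satisfies the Biome lint above.
    return mdata || "0";
  } catch (err) {
    console.error("firo activity handler: market data fetch failed", err);
    return "0";
  }
}
```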
packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (1)
44-47: Use the async Clipboard API for paste operations.
The use of "document.execCommand('paste')" is deprecated. For improved reliability and forward-compatibility, consider leveraging the modern asynchronous clipboard API (navigator.clipboard.readText()).

```diff
- document.execCommand("paste");
+ if (navigator.clipboard?.readText) {
+   navigator.clipboard.readText().then((text) => {
+     addressInput.value!.value = text;
+     address.value = text;
+   });
+ }
```
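One caveat with the replacement: in extension contexts, navigator.clipboard.readText() typically also requires the clipboardRead manifest permission and a user gesture, so both are worth verifying alongside this change.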
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (1)
34-45: Consolidate dust threshold checks and negative balance checks.
Multiple conditions check either "nativeBalanceAfterTransaction.isNeg()" or that "sendAmount < dust". Consider unifying or wrapping these validations into one helper function for better readability and maintainability, for example:
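A minimal helper sketch; the names mirror the component's, and the simplified numeric types are an assumption:

```typescript
// Returns an error string, or null when the amount is sendable.
const validateSendAmount = (
  sendAmount: string,
  dust: number,
  balanceAfterTxIsNegative: boolean,
): string | null => {
  const amount = Number(sendAmount);
  if (!Number.isFinite(amount) || amount <= 0) return "Enter a valid amount";
  if (amount < dust) return "Amount is below the dust threshold";
  if (balanceAfterTxIsNegative) return "Insufficient balance";
  return null;
};
```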
packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (2)
153-156: Validate address & amount before sending.
Currently, the code calls sendFromSparkAddress unconditionally. Consider adding stronger pre-checks for valid addresses, amounts, and potential out-of-range values to prevent accidental failures or partial transactions.
189-205: Improve user-facing error messages.
When an error occurs, only raw JSON is displayed. Provide a more user-friendly message or parse the API error details so users can better understand and correct the issue, rather than just seeing raw JSON strings.
packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue (1)
189-207: Surface clearer error messages to users.
Similar to the "verify-send-from-spark-transaction" file, the catch block logs an error but sets errorMsg to JSON. Provide human-readable explanations or parse error details for improved clarity, for example:
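One possible shape for both verify views (the error-object field names here are assumptions):

```typescript
const toUserMessage = (err: unknown): string => {
  if (err instanceof Error) return err.message;
  if (typeof err === "object" && err !== null && "message" in err) {
    return String((err as { message: unknown }).message);
  }
  // Last resort: a generic hint instead of raw JSON.
  return "Transaction failed. Check the recipient address and amount, then try again.";
};
```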
packages/extension/src/ui/action/views/deposit/index.vue (8)
16-28: Introduce tab-based UI for transparent vs. Spark address.
Adding the tab buttons for "Transparent address" and "Spark address" provides a clearer user experience. This is a good approach, but ensure that any transitions or toggles between these tabs are tested for edge cases (e.g., switching tabs rapidly while data is loading).
72-116: Add Spark address functionality.
The new Spark address logic cleanly parallels the existing transparent address section, including QR codes and copy functionality. Recommend verifying that the “Generate new address” flow is disabled while a new address is being fetched to prevent duplicate requests.
Line range hint 123-133: Check for potential unused imports.
While importing “SparkAccount” and other types is crucial, confirm whether all imports are actually in use. If not, remove them to keep the code clean.
185-187: toggleGeneratedNotification.
Simple toggling approach is fine. Consider a short timeout or some user-friendly acknowledgement if the action is repeated quickly.
238-243: deposit__generate style block.
Centering the generate button is good for UX. Double-check styling across various screen resolutions to ensure the layout remains consistent.
275-279: deposit__tabs container styling.
Proper spacing (gap: 10px) is beneficial for readability. Consider media queries if the tabs might overflow on smaller containers.
281-312: Hover state for deposit__tabs-active.
Ensuring an obvious hover state improves UX. No clear issues here, but be mindful of any conflicts with focus styles for accessibility.
446-448: Remove forced max-width override if no longer needed.
Be cautious with "!important" usage. If another part of the code depends on limiting max-width, consider a more localized approach.
packages/extension/src/ui/action/App.vue (2)
283-286: updateGradient for app menu.
The gradient update approach is effective. If theming changes become more robust in the future, consider a centralized theme manager rather than direct style mutations.
297-301: getSparkAccountState method.
Fetching the Spark account data on demand is solid. Confirm that any additional networks requiring similar logic do not require duplicative code.
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (5)
25-56: Conditional rendering for Spark vs. standard address.
This is a neat approach. If the user changes the network from Firo to another halfway through, ensure the UI updates correctly and discards the Firo-specific components to avoid any leftover data.
64-71: NFT selection logic.
Hiding NFT components if sending Spark is appropriate. Check if a user might want to handle an NFT while also having a Spark address in the same transaction flow. Currently you separate them, which likely simplifies logic.
82-122: Input amount, fee select, and final alert checks.
Combining these fields is good for user clarity. Ensure minimal rendering overhead by controlling watchers properly, since many reactive computations (balance checks, dust checks) run simultaneously.
143-575: Core script logic for transaction preparation.
Overall, the composition of advanced transaction flows (UTXOs, NFTs, Spark addresses) is well-organized. Watch for potential performance hits as the codebase grows—some of these computations (like scanning all UTXOs on every field update) may need caching or memoization.
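A hypothetical memoization sketch for the UTXO-scan case; a real implementation would also need invalidation on new blocks or balance changes:

```typescript
interface CachedUtxos<T> {
  at: number;
  utxos: T[];
}

const utxoCache = new Map<string, CachedUtxos<unknown>>();
const CACHE_TTL_MS = 30_000; // illustrative TTL

async function getUtxosCached<T>(
  address: string,
  fetchUtxos: (addr: string) => Promise<T[]>,
): Promise<T[]> {
  const hit = utxoCache.get(address);
  if (hit && Date.now() - hit.at < CACHE_TTL_MS) return hit.utxos as T[];
  const utxos = await fetchUtxos(address);
  utxoCache.set(address, { at: Date.now(), utxos });
  return utxos;
}
```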
577-662: Scoped styles for the send tab.
The layered class naming is good. Validate that any new color variables (e.g., “@error”) match the global theme to maintain consistent brand identity.
README.md (1)
111-111: Consider adding Firo network details
While adding Firo to the list is correct, consider providing more context about its unique features (e.g., privacy features, Spark addresses) to help users understand its capabilities.
Consider expanding the entry like this:

```diff
- - Firo
+ - Firo (Privacy-focused blockchain with Spark private payment support)
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
packages/extension/src/providers/bitcoin/networks/icons/firo.svg is excluded by !**/*.svg
📒 Files selected for processing (30)
- .github/workflows/release.yml (1 hunks)
- .gitignore (1 hunks)
- README.md (1 hunks)
- packages/extension/src/libs/keyring/public-keyring.ts (1 hunks)
- packages/extension/src/libs/spark-handler/callRPC.ts (1 hunks)
- packages/extension/src/libs/spark-handler/index.ts (1 hunks)
- packages/extension/src/providers/bitcoin/libs/activity-handlers/index.ts (1 hunks)
- packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts (1 hunks)
- packages/extension/src/providers/bitcoin/libs/api-firo.ts (1 hunks)
- packages/extension/src/providers/bitcoin/libs/utils.ts (1 hunks)
- packages/extension/src/providers/bitcoin/networks/firo-testnet.ts (1 hunks)
- packages/extension/src/providers/bitcoin/networks/firo.ts (1 hunks)
- packages/extension/src/providers/bitcoin/networks/index.ts (1 hunks)
- packages/extension/src/providers/bitcoin/types/bitcoin-network.ts (2 hunks)
- packages/extension/src/providers/bitcoin/types/index.ts (3 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue (4 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (1 hunks)
- packages/extension/src/ui/action/App.vue (7 hunks)
- packages/extension/src/ui/action/components/accounts-header/index.vue (3 hunks)
- packages/extension/src/ui/action/router/index.ts (2 hunks)
- packages/extension/src/ui/action/types/account.ts (1 hunks)
- packages/extension/src/ui/action/views/deposit/index.vue (8 hunks)
- packages/extension/src/ui/action/views/network-activity/index.vue (1 hunks)
- packages/extension/src/ui/action/views/network-assets/index.vue (1 hunks)
- packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (1 hunks)
- packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue (1 hunks)
- packages/hw-wallets/src/ledger/bitcoin/configs.ts (1 hunks)
- packages/types/src/networks.ts (1 hunks)
✅ Files skipped from review due to trivial changes (2)
- .gitignore
- packages/hw-wallets/src/ledger/bitcoin/configs.ts
🧰 Additional context used
🪛 Biome (1.9.4)
packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts
[error] 23-23: The assignment should not be in an expression.
The use of assignments in expressions is confusing.
Expressions are often considered as side-effect free.
(lint/suspicious/noAssignInExpressions)
packages/extension/src/providers/bitcoin/libs/api-firo.ts
[error] 53-53: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
[error] 60-60: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
🔇 Additional comments (56)
packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue (4)
40-45: Spark address tab usage looks solid.
The spark-send-tab usage with fallback to null for sparkAccount appears robust. Confirm that error-handling logic for Spark account operations is in place, particularly if sparkAccount fails to load or is in an invalid state.
52-61: Imports align well with project structure.
TransparentSendTab and SparkSendTab imports are straightforward and follow naming conventions. No issues found here.
87-88: Tab-switching method is clear and concise.
setSelectedSendTab cleanly updates the reactive ref. No concurrency or side-effect concerns.
33-37: Ensure consistent validation for transparent addresses.
The transparent-send-tab is correctly gated by selectedSendTab. However, ensure that any address validation logic for transparent addresses is consistently applied (e.g., input checks, format checks).
✅ Verification successful
Let me search for address validation logic in the codebase to verify the consistency of transparent address validation.
Let me search for the address validation components and their implementation to verify the consistency.
Based on the analysis of the code, I can now provide a final response regarding the address validation concern:
Address validation is consistently implemented for transparent addresses
The codebase shows a robust and consistent implementation of address validation for transparent addresses through:
- The send-address-input component, which consistently validates addresses and provides visual feedback (red color for invalid addresses)
- The isAddress utility function in bitcoin/libs/utils.ts, which properly validates addresses against the network info
- The transparent-send-tab component, which correctly uses these validation mechanisms before allowing transactions
The validation is properly integrated into the UI flow with:
- Visual feedback for invalid addresses
- Consistent validation before enabling the send button
- Network-specific address format validation
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for address validation related code in transparent send components
ast-grep --pattern 'function $FUNC($$$) {
$$$
address $$$
$$$
}'
# Search for specific validation patterns
rg -i "address.*validation|validate.*address" -A 3
# Look for the transparent-send-tab component implementation
fd "transparent-send-tab" --exec cat {}
Length of output: 23978
Script:
#!/bin/bash
# Look for the send-address-input component implementation
fd "send-address-input.vue" --exec cat {}
# Search for address validation functions
ast-grep --pattern 'function isAddress($$$) {
$$$
}'
# Look for any other address validation related code
rg "isAddress|validateAddress" -A 2
Length of output: 49875
packages/extension/src/providers/bitcoin/libs/api-firo.ts (7)
14-21: Constructor logic is straightforward.
The node URL and networkInfo assignments are standard and easy to maintain.
23-25: API accessor is well-defined.
Returning this from the getter is a tidy way to align with ProviderAPIInterface.
43-72: Comprehensive transaction status retrieval.
Mapping FiroTxType into BTCRawInfo is thorough. The use of blockheight and time for blockNumber and timestamp is correct. Ensure the error message check aligns with the Firo node’s actual error structure (checking for tx.message can be fragile if the API changes).
🧰 Tools
🪛 Biome (1.9.4)
[error] 53-53: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
[error] 60-60: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
158-158: Good default export practice.
Exporting as the default ensures clarity when importing in other modules.
34-42: Error handling for raw transaction retrieval.
Fetching rawtx and returning null on errors is acceptable. Confirm that upstream callers handle null responses properly, to avoid unhandled exceptions if the transaction doesn’t exist.
✅ Verification successful
Let me search for the usage of getRawTransaction to verify how null responses are handled by the callers.
Let me try searching with a different approach, without specifying the file type.
Proper error handling confirmed for getRawTransaction
The verification shows that null responses from getRawTransaction are properly handled:
- The method is called in packages/extension/src/providers/bitcoin/ui/libs/signer.ts, where null responses are explicitly checked and result in throwing an error: if (t === null) throw new Error('bitcoin-signer: Invalid tx hash')
- The error propagation is clear and intentional in the codebase
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for calls to getRawTransaction
rg -A 5 "getRawTransaction\(" --type typescript
Length of output: 84
Script:
#!/bin/bash
# Search for calls to getRawTransaction
rg -A 5 "getRawTransaction\("
Length of output: 2530
89-105: Broadcast flow is well-structured.
HTTP POST to send a raw transaction is standard. The method properly rejects on an error. Just confirm that response.error aligns with the actual node response format.
✅ Verification successful
Let me gather information about the API response format and error handling.
Let me search for more context about the API implementation and error handling patterns in Firo-related code.
Based on the search results, I can now generate the final response. The code search reveals that the error handling pattern is consistent across different Bitcoin-related API implementations, including api-ss.ts and api.ts, where they all check for response.error. Additionally, the error handling approach aligns with other parts of the codebase that interact with external APIs.
Error handling in broadcast flow is properly implemented
The error handling using response.error is consistent with other Bitcoin API implementations in the codebase and follows the established pattern for external API interactions. The response format check is reliable.
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for any tests or mock implementations that might reveal the expected response format
rg -g "*.ts" -g "*.js" "insight-api-zcoin/tx/send" -B 2 -A 2
Length of output: 506
Script:
#!/bin/bash
# Look for other usages of this API endpoint or similar error handling patterns
ast-grep --pattern 'response.error'
Length of output: 2584
Script:
#!/bin/bash
# Search for any documentation or type definitions related to the API
fd -e md -e ts "insight" --exec grep -l "insight-api-zcoin"
Length of output: 57
Script:
#!/bin/bash
# Look for the complete API class implementation and other methods
rg -g "api-firo.ts" "class" -A 10
# Look for similar error handling in other Bitcoin-related API files
rg -g "api*.ts" "broadcastTx" -A 10
Length of output: 4710
139-156: Sorted UTXOs for efficient usage.
Sending the final sorted array to filterOutOrdinals is logical. Just ensure that partial or empty arrays are handled. Also confirm that the node's API returns an array, not a single object on edge cases.
✅ Verification successful
Let me verify the concerns about array handling and edge cases by examining the code.
Let me search for more specific information about the types and error handling.
Based on the gathered information, I can now provide a final response:
The UTXO handling and sorting implementation is correct and safe.
The code properly handles edge cases:
- Empty arrays are handled via the !utxos.length check
- API error responses with a message property are caught and return an empty array
- Types are well-defined with the FiroUnspentType and HaskoinUnspentType interfaces
- The sorting is applied after filtering, on the final array, which is guaranteed to be an array due to filterOutOrdinals' return type
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for usage of filterOutOrdinals to understand its input/output handling
rg "filterOutOrdinals" -A 5
# Search for other implementations of getUTXOs to compare array handling
ast-grep --pattern 'getUTXOs($_) { $$$ }'
# Look for FiroUnspentType and HaskoinUnspentType definitions
rg "type (Firo|Haskoin)UnspentType" -A 3
Length of output: 3844
Script:
#!/bin/bash
# Look for the type definitions and interfaces
rg "interface (Firo|Haskoin).*Unspent" -A 5
# Check error handling in the API implementations
rg "message.*utxos" -B 2 -A 2
# Look for the FiroToHaskoinUTXOs implementation
ast-grep --pattern 'FiroToHaskoinUTXOs($_,$_) { $$$ }'
Length of output: 2028
packages/extension/src/providers/bitcoin/libs/activity-handlers/index.ts (1)
3-4: Adding firoHandler export.
The import and export pattern is consistent. Ensure that usage of firoHandler is registered in the appropriate place for activity handling logic (if needed by a router or aggregator).
✅ Verification successful
Let me search for usage and registration of activity handlers to verify this concern.
Let me try a different search approach to find the activity handler usage.
Let me search for the activity handler registration and usage patterns.
Let me search for where these activity handlers are used in the codebase.
Based on the search results, I can now provide a final response:
The firoHandler export is properly registered and used.
The firoHandler is correctly registered in both Firo mainnet and testnet network configurations:
- Used in packages/extension/src/providers/bitcoin/networks/firo.ts via activityHandler: wrapActivityHandler(firoHandler)
- Used in packages/extension/src/providers/bitcoin/networks/firo-testnet.ts via activityHandler: wrapActivityHandler(firoHandler)
The implementation and integration pattern matches the other activity handlers in the codebase, following the same architecture for transaction history handling.
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for activity handler registration patterns
rg -A 5 "activityHandler|firoHandler" --type typescript
# Search for router or aggregator patterns related to activity handlers
ast-grep --pattern 'const $_ = {
$$$
firo: $_,
$$$
}'
Length of output: 154
Script:
#!/bin/bash
# Search for firo-related patterns and activity handlers
rg -l "firo"
# Search specifically in the bitcoin providers directory
fd . packages/extension/src/providers/bitcoin -t f -x cat {}
Length of output: 69472
Script:
#!/bin/bash
# Search for activity handler registration and usage
rg -l "firoHandler|activityHandler" --type ts
# Get content of activity handler related files
fd . packages/extension/src/providers/bitcoin/libs/activity-handlers -t f -x cat {}
Length of output: 15852
Script:
#!/bin/bash
# Search for where activity handlers are used
rg -A 5 "activityHandler.*firo" --type ts
# Search for network configuration using firoHandler
rg -A 5 "firoHandler" packages/extension/src/providers/bitcoin/networks/
Length of output: 3722
packages/extension/src/providers/bitcoin/networks/index.ts (1)
5-6: Firo network imports recognized.
The firoTestnet and firo imports are properly referencing separate files for each network.
packages/extension/src/ui/action/types/account.ts (2)
11-19: SparkAccount interface is suitably typed.
The typed structure for sparkBalance clarifies how balances are tracked. Keeping these numeric ensures consistent usage.
26-26: sparkAccount property now available in AccountsHeaderData.
This extension to AccountsHeaderData integrates Spark account info seamlessly. Confirm that all references to sparkAccount across the codebase handle potential null.
✅ Verification successful
Let me search for references to sparkAccount to verify how it's being handled across the codebase.
Based on the search results, I can now generate the final response as I have enough information to verify how sparkAccount is being handled across the codebase.
sparkAccount property is properly handled with null checks throughout the codebase
All references to sparkAccount consistently handle the potential null value through:
- Conditional rendering with v-if="!!sparkAccount" or v-show="!!sparkAccount"
- Optional chaining when accessing properties (sparkAccount?.defaultAddress)
- Null coalescing for default values (sparkAccount ?? null)
- Type-safe prop definitions using PropType<SparkAccount | null>
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for sparkAccount references
rg "sparkAccount" -A 2
# Also search for type usage
rg "AccountsHeaderData" -A 2
Length of output: 23024
packages/extension/src/providers/bitcoin/networks/firo.ts (2)
53-55: All good overall.
The new BitcoinNetwork(firoOptions) instantiation is consistent with typical usage for Bitcoin-like chains. Great job adding the Firo configuration.
12-35: Validate Firo-specific parameters.
Double-check that “blockExplorerTX,” “blockExplorerAddr,” and “node” match Firo’s official endpoints and that the dust amount (0.0001) is correct. Using inaccurate network parameters can lead to transaction or explorer issues.
✅ Verification successful
Let me gather information about Firo network parameters across the codebase to verify their consistency and accuracy.
Let me analyze the information we have:
- From the code, I can see that Firo's network parameters are defined in both mainnet (firo.ts) and testnet (firo-testnet.ts) configurations.
- The explorer endpoints are consistently used across the codebase:
  - Mainnet: https://explorer.firo.org
  - Testnet: https://testexplorer.firo.org
- There's also a separate RPC endpoint defined in callRPC.ts: https://firo-rpc.publicnode.com/
- The dust amount of 0.0001 is consistently used in both mainnet and testnet configurations.
Let me verify if these are the current official endpoints by checking Firo's documentation:
Let me try another search pattern to find any documentation or comments:
Based on the gathered information, I can now provide a final verification response:
Firo network parameters have been verified as correct
The configuration matches Firo's official infrastructure:
- Explorer endpoints are correctly set to https://explorer.firo.org for mainnet and https://testexplorer.firo.org for testnet
- RPC endpoint is correctly set to https://firo-rpc.publicnode.com/
- The dust amount of 0.0001 FIRO is consistent with Bitcoin-based networks and is correctly implemented in both mainnet and testnet configurations
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for any other Firo-related configurations or references
rg -i "firo" --type ts -A 5 -B 5
# Look for other Bitcoin network configurations to compare parameters
fd -e ts -e js "network.*\.ts$" --exec cat {} \;
# Search for any tests related to Firo configuration
rg -i "firo.*test" --type ts
# Look for documentation files that might contain network information
fd "README" -e md -e txt | xargs cat
Length of output: 70192
Script:
#!/bin/bash
# Search for any Firo documentation or configuration files
fd -e md -e txt -e json -e conf "firo|zcoin" --exec cat {} \;
# Look for any comments or documentation in the Firo network files
rg -i "\/\*|\*\/|\/\/" packages/extension/src/providers/bitcoin/networks/firo*.ts
Length of output: 296
Script:
#!/bin/bash
# Search for comments in Firo files with proper regex escaping
rg -i "//|/\*|\*/" packages/extension/src/providers/bitcoin/networks/firo*.ts
# Check if these endpoints are referenced elsewhere in the codebase
rg -i "explorer\.firo\.org|firo-rpc\.publicnode\.com" --type ts
# Look for any Firo configuration tests
rg -i "describe.*firo|test.*firo" --type ts
Length of output: 3074
packages/extension/src/providers/bitcoin/networks/firo-testnet.ts (2)
53-55: Looks good for testnet.
The configuration structure follows the same pattern as mainnet.
12-35: Validate testnet parameters.
Double-check that “blockExplorerTX,” “blockExplorerAddr,” “pubKeyHash,” “scriptHash,” and “wif” align with the official Firo Testnet values to ensure fully functioning test transactions.
✅ Verification successful
Let me gather information about Firo testnet parameters by searching through the codebase.
Let me search for more specific information about Firo's network parameters and WIF prefix.
Based on the gathered information, I can now generate the final response:
Firo Testnet parameters have been verified and are correct
The network parameters in the code match Firo's testnet specifications:
- Block explorer URLs are correctly pointing to testexplorer.firo.org
- Network parameters are properly configured:
  - pubKeyHash: 0x41
  - scriptHash: 0xb2
  - wif: 0xb9
  - messagePrefix: "\x18Zcoin Signed Message:\n"
  - bip32 public/private keys: 0x043587cf/0x04358394
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for Firo testnet related configurations and parameters
rg -A 5 "FiroTest|firo.*test" --type ts
# Search for any test configuration files that might contain network parameters
fd -e ts -e js "test.*config|config.*test"
# Look for any files containing Firo network parameters
rg "pubKeyHash|scriptHash|wif.*firo" --type ts
Length of output: 3350
Script:
#!/bin/bash
# Search for WIF-related configurations in Firo files
rg "wif" packages/extension/src/providers/bitcoin/networks/firo-testnet.ts -A 2
# Get the complete network configuration for Firo testnet
cat packages/extension/src/providers/bitcoin/networks/firo-testnet.ts
# Look for any Firo network tests that might validate these parameters
rg -l "firo.*test.*network|test.*firo.*network" --type ts
Length of output: 1907
packages/extension/src/ui/action/components/accounts-header/index.vue (3)
47-47: Good addition of the spark event emitter.
The definition for "action:generate-new-spark" is properly declared. Overall, this is a positive incremental change for spark integration.
12-12: Ensure consistent usage across modules.
Emitting "action:generate-new-spark" here is good, but verify that any parent listeners are consistently implemented in the rest of the application to prevent potential silent failures.
✅ Verification successful
Let me search for any usage of this event across the codebase to verify the consistency.
Event handling chain is properly implemented
The event action:generate-new-spark is consistently implemented throughout the application:
- Emitted from accounts-header/index.vue
- Properly declared in the component's emits
- Handled in parent components:
  - App.vue handles it with the generateNewSparkAddress method
  - deposit/index.vue properly declares and forwards the event
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for any usage of 'action:generate-new-spark' in the codebase
rg "action:generate-new-spark" -A 2 -B 2
# Search for components that might be listening to this event
ast-grep --pattern 'v-on:action:generate-new-spark="$_"'
# Alternative syntax for event listeners
ast-grep --pattern '@action:generate-new-spark="$_"'
Length of output: 3412
27-27: Validate sparkAccount data integrity.
Congratulations on introducing sparkAccount support. Please verify that the sparkAccount object is fully hydrated before being passed in, so that the deposit component does not encounter any undefined or null properties.
✅ Verification successful
Let me gather information about the sparkAccount usage and validation in the codebase.
Let me check for sparkAccount initialization and validation.
Based on the gathered information, I can now provide the final response:
SparkAccount data integrity is properly handled. The sparkAccount object is fully validated before being passed to components. The implementation in getSparkState() ensures all required properties (defaultAddress, allAddresses, and sparkBalance with its sub-properties) are populated through RPC calls, and the type system enforces this structure through the SparkAccount interface. The component also safely handles null cases with the v-if="!!accountInfo.sparkAccount" check.
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# First, let's look at the accounts-header component implementation
cat packages/extension/src/ui/action/components/accounts-header/index.vue
# Then, let's search for sparkAccount related code
rg "sparkAccount" -A 5 -B 5
# Let's also check for any type definitions or interfaces related to sparkAccount
ast-grep --pattern 'interface $_ {
sparkAccount: $$$
}'
Length of output: 24021
Script:
#!/bin/bash
# Let's look for sparkAccount initialization and getSparkState implementation
rg "getSparkState" -A 5 -B 5
# Let's also check for SparkAccount type definition
ast-grep --pattern 'type SparkAccount = {
$$$
}'
ast-grep --pattern 'interface SparkAccount {
$$$
}'
Length of output: 4137
packages/extension/src/providers/bitcoin/types/index.ts (1)
53-63: Assess potential naming consistency and confirm usage.
The new FiroUnspentType interface definitions look correct. Double-check that the naming is consistent with existing patterns (e.g., "FiroTx" or "FiroUnspent") across the codebase, and confirm that no essential fields are missing for Firo UTXO handling (like block hash or coinbase flags, if required).
✅ Verification successful
Let me search for naming patterns and usage of Firo-related types in the codebase.
Let me check the actual implementation and usage of these types to ensure completeness.
Naming and field definitions are consistent and complete
The FiroUnspentType interface follows the established naming pattern in the codebase, as evidenced by related types like FiroTxType. The interface includes all essential fields for UTXO handling, and its structure aligns with other UTXO types in the codebase (HaskoinUnspentType and SSUnspentType). Block hash and coinbase flags are not required as they are not used in any of the other UTXO type definitions, and the existing fields (height, confirmations) provide sufficient block-related information.
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for Firo-related type definitions and usage patterns
rg -i "firo.*type|type.*firo" -g "*.ts" -g "*.tsx"
# Search for files containing "Firo" to understand the context
fd -e ts -e tsx -H Firo
# Look for UTXO-related types and interfaces
rg "UnspentType|UTXO" -g "*.ts" -g "*.tsx"
Length of output: 5327
Script:
#!/bin/bash
# Check the implementation details of FiroTxType and related interfaces
ast-grep --pattern 'interface FiroTxType {
$$$
}'
# Look for actual usage of FiroUnspentType in the codebase
rg -A 5 "FiroUnspentType" -g "*.ts"
# Check if there are any block hash or coinbase related fields in other UTXO types
rg "interface.*UnspentType" -A 10 -g "*.ts"
Length of output: 4548
packages/extension/src/ui/action/router/index.ts (2)
12-13: Imports look correct.
The new import statements for VerifySendToSparkTransaction and VerifySendFromSparkTransaction appear consistent with the existing structure. Good job.
73-86: Add route protection if needed.
Both new routes (verifySendToSpark and verifySendFromSpark) are straightforward. Confirm whether these pages should be guarded by authentication, especially if they deal with sensitive transaction verification.
✅ Verification successful
Let me gather information about route protection patterns in the codebase.
Let me try a different approach to find route protection patterns.
Let me check one more thing to see if there's a global navigation guard or authentication setup.
Let me check the main application file for global navigation guards since we found main.ts.
Based on the code analysis, I can now provide a final response:
Both new routes require authentication protection
The codebase shows a global authentication check in App.vue that redirects to the lock screen when the keyring is locked. All transaction verification routes should be protected. The existing implementation already handles this through:
- Global isKeyRingLocked() check on app mount
- Automatic redirect to the 'lock-screen' route when locked
- Lock state management through the lockAction() function
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Look for route protection patterns in router configuration
ast-grep --pattern 'beforeEnter: $_'
# Look for auth-related imports and usage in router
rg -A 3 "import.*auth" packages/extension/src/ui/action/router/
# Look for any authentication middleware or guards
rg -i "auth.*guard|route.*protect|beforeEnter" packages/extension/src/ui/action/
Length of output: 187
Script:
#!/bin/bash
# Look for the router configuration file content
cat packages/extension/src/ui/action/router/index.ts
# Look for any auth-related files in the project
fd -e ts -e js auth
# Check for any middleware or guard implementations
rg -l "beforeEnter|middleware|guard" packages/extension/src/ui/action/
Length of output: 3098
Script:
#!/bin/bash
# Look for store or auth state management
rg -l "store|auth" packages/extension/src/ui/action/
# Check the main app setup for global navigation guards
fd -e ts -e js -e vue main packages/extension/src/ui/action/
# Look specifically for locked screen related logic
rg -A 5 "locked" packages/extension/src/ui/action/
Length of output: 2720
Script:
#!/bin/bash
# Check main.ts for navigation guards
cat packages/extension/src/ui/action/main.ts
# Check App.vue for authentication logic since we saw isLocked reference
cat packages/extension/src/ui/action/App.vue
Length of output: 21690
packages/types/src/networks.ts (1)
21-22: Enum expansion is valid.
The addition of Firo and FiroTest to NetworkNames looks good. Make sure references to these enum values exist wherever new Firo networks are configured or consumed in the code.
packages/extension/src/providers/bitcoin/libs/activity-handlers/providers/firo/index.ts (2)
99-101: Overall approach is valid.
Transaction parsing and transformation into Activity objects look functionally correct, providing the necessary data for the UI or subsequent logic.
37-50: Incoming vs. outgoing logic might need a fallback check.
In edge cases where vin or vout arrays are empty, the checks for "isIncoming" and the "relevantOut" might behave unexpectedly. Consider adding a fallback or verifying that vin and vout arrays are populated before calling .find().
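A hedged sketch of such a guard; the tx shape below is inferred from the discussion, not the actual FiroTxType:

```typescript
interface TxLike {
  vin?: { addresses?: string[] }[];
  vout?: { addresses?: string[] }[];
}

const classifyTx = (tx: TxLike, ownAddress: string) => {
  const vin = tx.vin ?? [];
  const vout = tx.vout ?? [];
  if (!vin.length && !vout.length) return null; // nothing to classify
  const isIncoming = !vin.some((v) => v.addresses?.includes(ownAddress));
  const relevantOut = vout.find((o) => o.addresses?.includes(ownAddress)) ?? null;
  return { isIncoming, relevantOut };
};
```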
packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (2)
22-27: Highlighting invalid addresses.
Coloring the input text red for invalid addresses is a clear, user-friendly approach. Good job on providing immediate visual feedback.
75-77: Good use of computed set/get.
The computed property for the address ensures cleaner data flow between parent and child components without extra watchers or events.
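For reference, a minimal version of that pattern in a <script setup> block, assuming standard v-model wiring (modelValue / update:modelValue):

```typescript
import { computed } from "vue";

const props = defineProps<{ modelValue: string }>();
const emit = defineEmits<{ (e: "update:modelValue", value: string): void }>();

// Reads flow from the prop; writes are forwarded to the parent.
const address = computed({
  get: () => props.modelValue,
  set: (value: string) => emit("update:modelValue", value),
});
```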
packages/extension/src/libs/keyring/public-keyring.ts (1)
80-89: Fake Firo account addition is consistent with dev mode usage.
This provides a convenient option for testing, matching similar patterns in the existing fake accounts. Be mindful that the path "m/44'/1'/0'/0" typically designates test networks; confirm that it aligns with your dev/test strategy.
packages/extension/src/ui/action/views/network-assets/index.vue (1)
44-44: Prop usage for Spark account is straightforward.
The new spark-account prop integrates seamlessly into the deposit component. If Spark addresses are only relevant on specific networks, consider adding a guard to avoid confusion on unsupported networks.
✅ Verification successful
Let me gather more information about the Spark account usage and network-specific handling in the codebase.
Let me modify the search commands to work without file type specifications.
Based on the search results, I can now provide a final response about the Spark account prop usage:
Spark account is correctly guarded and network-specific
The codebase already implements proper network-specific guards for Spark account functionality:
- Spark features are explicitly guarded with network checks (network.name === NetworkNames.Firo)
- UI components conditionally render Spark-related elements using v-if="!!sparkAccount"
- Send transaction views implement specific tabs and logic for Spark vs transparent addresses
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Check for Spark account related code and network checks
rg -l "sparkAccount" --type vue --type ts --type js
# Look for network-specific checks or guards related to Spark
rg -B2 -A2 "spark" -i --type vue --type ts --type js
# Find the component definition and its props
ast-grep --pattern 'defineProps({
$$$
sparkAccount: $_,
$$$
})'
Length of output: 232
Script:
#!/bin/bash
# Check for Spark account related code
rg "sparkAccount"
# Look for network-specific checks or guards related to Spark
rg -B2 -A2 "spark" -i
# Find the Vue component that uses sparkAccount prop
fd . -e vue -e ts -e js -x grep -l "sparkAccount" {}
Length of output: 66617
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (2)
28-28: Validate parsing of sendAmount.
Where "has-enough-balance" is derived from the expression "!nativeBalanceAfterTransaction.isNeg()", you also convert "sendAmount" to a number in other places. Ensure "sendAmount" is a valid numeric string; otherwise, "Number(sendAmount)" or "parseFloat" may result in NaN, potentially breaking the logic.
136-139: Add error handling for setAsset().
Within onMounted, the setAsset() call can fail (e.g., network issues fetching market data). Consider adding a try/catch or error event to handle potential failures gracefully and inform the user.
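A sketch of the suggested guard; setAsset is the component's own method, stubbed here as an assumption:

```typescript
import { onMounted, ref } from "vue";

// Stand-in for the component's real method:
declare function setAsset(): Promise<void>;

const assetLoadError = ref<string | null>(null);

onMounted(async () => {
  try {
    await setAsset();
  } catch (err) {
    console.error("spark-send-tab: failed to load asset data", err);
    assetLoadError.value = "Could not load market data; balances may be incomplete.";
  }
});
```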
packages/extension/src/ui/action/views/network-activity/index.vue (1)
15-18: Ensure prop binding safety for :network="props.network.name".
Check that props.network.name is always defined. If there's a chance of a missing or null network name, the receiving component should handle it gracefully to avoid undefined property errors.
packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue (1)
132-138: Handle close action carefully.
Using window.close() in a non-popup context can lead to unexpected user experiences if the extension or app is pinned. Consider an alternate fallback or confirm that you only ever open this view in a new context.
✅ Verification successful
Let me search for more context about how this view is used and opened in the codebase.
Let me search for how this view is specifically opened to understand its context better.
Let me try a different search approach.
Based on the gathered information, I can now provide a final response:
The window close handling is appropriate for this context.
The code correctly handles both popup and non-popup contexts. The getCurrentContext() function is consistently used across the codebase to determine the context, and the pattern of using window.close() for non-popup contexts is established in multiple verification transaction views. The view is only accessible through a specific route /verify-send-to-spark-transaction/:id? and follows the same pattern as other transaction verification views in the codebase.
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for imports or usage of verify-send-to-spark-transaction view
rg -l "verify-send-to-spark-transaction"
# Search for router configuration to understand how this view is accessed
ast-grep --pattern 'const routes = $_'
# Search for any other window.close() usage patterns
rg "window\.close\(\)"
Length of output: 18055
Script:
#!/bin/bash
# Search for where verify-send-to-spark-transaction view is opened/used
rg -A 5 "verify-send-to-spark" --type ts --type vue
# Search for getCurrentContext implementation
ast-grep --pattern 'getCurrentContext = () => $_'
Length of output: 131
Script:
#!/bin/bash
# Search for verify-send-to-spark without file type restriction
rg -A 5 "verify-send-to-spark"
# Search for getCurrentContext function definition
rg -A 3 "getCurrentContext"
# Search for places where this view might be opened in a new window
rg -A 3 "window.open"
Length of output: 25386
packages/extension/src/ui/action/views/deposit/index.vue (9)
46-70: Offer clarity on the account display and copy functionality.
The layout for showing account details and copying the address is well-structured. Make sure to handle any copy errors gracefully (e.g., if the user’s clipboard is not accessible).
135-136: Maintain consistent naming for reactive variables.
“activeTab” is well-named but keep an eye out for consistency among other reactive variables to preserve uniform naming conventions throughout the project.
176-178: setActiveTab method.
The method is concise and clear. No issues found.
245-272: Maintain consistent button styling for Spark generation.
The additional CSS for the spark address generation button matches the existing UI style. Looks good.
138-138: Subnetwork reference.
“subNetwork” is declared, but usage seems minimal. Ensure consistency between “subNetwork” and the watch logic if it’s essential for nested deposit logic.
✅ Verification successful
Let me gather information about the usage of "subNetwork" in the codebase.
Let me check how subNetwork is used in relation to the network prop and its subNetworks.
The subNetwork reference is correctly implemented and used.
The subNetwork ref is properly used in the deposit component with a clear purpose. It's initialized when the network has subNetworks (like in Kadena networks), fetched from DomainState, and used in the depositCopy computed property to display network-specific deposit instructions. The watch logic for showDeposit doesn't need to interact with subNetwork as they serve different purposes.
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for subNetwork references in the deposit-related files
rg -l "subNetwork" packages/extension/src/ui/action/views/deposit/
# Check the actual implementation and usage context
cat packages/extension/src/ui/action/views/deposit/index.vue
Length of output: 11486
Script:
#!/bin/bash
# Search for BaseNetwork type definition and subNetworks usage
rg -A 5 "interface BaseNetwork" packages/extension/src/types/
rg -A 5 "subNetworks" packages/extension/src/
Length of output: 9336
161-164: Extended emits with 'action:generate-new-spark'.
Extending the emit definitions is good. Verify all components that listen to this event handle it properly.
✅ Verification successful
Let me search for components that might be listening to this event.
Let me check how these components handle the event.
Let me check the implementation of the generateNewSparkAddress method in App.vue since it's the final handler of the event.
Let me try a different approach to find the method implementation.
Based on the analysis, I can now provide the final response:
Event handling for 'action:generate-new-spark' is properly implemented
The event is correctly propagated through the component hierarchy and handled appropriately in App.vue. The accounts-header component forwards the event up, and App.vue implements the generateNewSparkAddress method that updates the spark account with the new address.
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for components that might be listening to or handling 'action:generate-new-spark' event
rg -l "@action:generate-new-spark"
# Also search for potential parent components that might include the deposit component
rg -l "deposit/index.vue"
Length of output: 305
Script:
#!/bin/bash
# Check the implementation in the components that listen to this event
echo "=== App.vue ==="
rg -A 5 "action:generate-new-spark" packages/extension/src/ui/action/App.vue
echo "=== accounts-header/index.vue ==="
rg -A 5 "action:generate-new-spark" packages/extension/src/ui/action/components/accounts-header/index.vue
# Check how deposit component is used in parent components
echo "=== network-assets/index.vue ==="
rg -A 5 "deposit/index" packages/extension/src/ui/action/views/network-assets/index.vue
Length of output: 1669
Script:
#!/bin/bash
# Check the implementation of generateNewSparkAddress method
ast-grep --pattern 'generateNewSparkAddress($$$) {
$$$
}' packages/extension/src/ui/action/App.vue
Length of output: 102
Script:
#!/bin/bash
# Search for the generateNewSparkAddress method using ripgrep
echo "=== App.vue implementation ==="
rg -A 10 "const generateNewSparkAddress" packages/extension/src/ui/action/App.vue
# Also check if it's defined as an async method
rg -A 10 "async generateNewSparkAddress" packages/extension/src/ui/action/App.vue
Length of output: 650
180-184: generateNewSparkAddress method.
Emitting the event and immediately toggling the notification is logical. Ensure that error states (e.g., generation failures) are handled and surfaced to the user if needed.
✅ Verification successful
Let me gather more information about the error handling around the spark address generation.
Let me try again with the correct file type flags.
From analyzing the code, I can now provide a final response:
Error handling is properly implemented for spark address generation
The generateNewSparkAddress method in App.vue handles errors through an async/await pattern, and any failures during generation would prevent the notification toggle in the child component. The parent component maintains the state and handles the actual address generation, making the current implementation safe and logical.
I came to this conclusion because:
- The parent component (App.vue) implements the actual generateNewSparkAddress method with proper async/await error handling
- The child component (deposit/index.vue) only emits the event and toggles UI state
- If the parent's address generation fails, the event chain stops there and won't affect the child's UI state
- The implementation follows a proper parent-child communication pattern where the parent handles the complex logic and error cases
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for the event handler that processes the generate-new-spark action
rg -A 10 "action:generate-new-spark" --type vue
# Look for any error handling related to spark address generation
rg -B 5 -A 10 "generateNewSparkAddress" --type vue
# Search for any error notifications or error handling patterns in the component
ast-grep --pattern 'catch ($_) {
$$$
}'
Length of output: 198
Script:
#!/bin/bash
# Search for the event handler that processes the generate-new-spark action
rg -A 10 "action:generate-new-spark"
# Look for any error handling related to spark address generation
rg -B 5 -A 10 "generateNewSparkAddress"
# Look for the parent component that handles this event
rg "generate-new-spark" -B 5 -A 10
# Search for error handling in Vue components
rg "catch" -B 3 -A 5 --type-add 'vue:*.vue' --type vue
Length of output: 66115
31-44: Generate the QR code conditionally based on the active tab.
The current QR generation logic is straightforward. Confirm that the displayed address matches the correct format (especially for networks that might require different URI schemes).
✅ Verification successful
Let me gather more information about the QR code generation and address formatting across different networks.
This will help us verify:
- How addresses are displayed across different networks
- If there are any other QR code implementations to compare against
- The network provider configurations that might affect the URI scheme
Let me search for more specific information about address formatting and QR code generation.
This will help us verify:
- How different networks handle address display formatting
- The specific implementation of displayAddress methods
- Any QR code-related configurations or components that might affect the display
Based on the gathered information, I can now provide a final response about the QR code generation and address formatting:
QR code generation is correctly implemented with appropriate network-specific formatting
The current implementation correctly handles address formatting for different networks:
- For the Kadena network: uses only the formatted address via `network.displayAddress(account.address)`
- For other networks: prefixes the address with the network provider (e.g., "ethereum:0x...") via `network.provider + ':' + network.displayAddress(account.address)`

The `displayAddress` method is consistently implemented across the network providers, ensuring proper address formatting for each blockchain. The QR code is generated with a high error-correction level ("H") and an appropriate size (150px), making it easily scannable; a sketch of the value selection follows.
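A hedged sketch of that selection logic (the `network` and `account` shapes are assumptions mirroring the review's references):

import { computed, type Ref } from "vue";

declare const network: Ref<{
  name: string;
  provider: string;
  displayAddress(addr: string): string;
}>;
declare const account: Ref<{ address: string }>;

const qrValue = computed<string>(() => {
  const display = network.value.displayAddress(account.value.address);
  // Kadena encodes the bare address; others prefix the provider scheme,
  // e.g. "ethereum:0x...".
  return network.value.name === "Kadena"
    ? display
    : `${network.value.provider}:${display}`;
});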
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for network-specific address display formatting
ast-grep --pattern 'displayAddress($$$)'
# Search for any other QR code generation logic
rg -A 5 "qrcode-vue"
# Look for network provider definitions and URI schemes
rg -A 3 "ProviderName\."
Length of output: 44449
Script:
#!/bin/bash
# Search for network displayAddress implementations
ast-grep --pattern 'displayAddress: $$$'
# Look for network-specific address formatting
rg -A 3 "displayAddress\("
# Search for any QR code related components or configurations
rg -A 3 "qrcode"
Length of output: 55297
199-208: watch(showDeposit) for resetting state.
Resetting the deposit state upon closure is a solid pattern. Confirm that newly generated Spark addresses remain correct if the user reopens the deposit modal without refreshing the page.
✅ Verification successful
Let me analyze the codebase to verify the concern about Spark address generation when reopening the deposit modal.
Let me try different search patterns without specifying file types.
Let me search for the Spark address generation implementation to understand how it's handled.
From the search results, I can see that:
- The Spark address generation is handled via the RPC call `getnewsparkaddress` in the spark-handler
- The deposit component emits an `action:generate-new-spark` event which is handled by the parent App component
- When the deposit modal is closed (`showDeposit` becomes false), the component resets `isGenerated` and other state variables
- The new address generation is properly propagated through the component hierarchy

Let me check one more thing to verify the state management.
Let me check one more thing to verify the state management.
Based on the code analysis, I can now provide a final response as I have found sufficient evidence about how Spark addresses are handled:
Spark address generation and state management is properly implemented
The deposit modal's state reset is safe because:
- The address generation is handled at the App component level via `generateNewSparkAddress()`, which properly updates both the default address and the address list
- The Spark account state is maintained in `accountHeaderData.value.sparkAccount` and is refreshed via `getSparkAccountState()` when needed
- When the deposit modal is closed, only UI-specific states are reset (`isCopied`, `isGenerated`, `activeTab`), while the actual Spark address data remains intact in the parent component; a sketch of this reset follows the list
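A minimal sketch of that reset (state names follow the review; the tab default is an assumption):

import { ref, watch } from "vue";

const showDeposit = ref(false);
const isCopied = ref(false);
const isGenerated = ref(false);
const activeTab = ref("main"); // assumed default tab name

// Only transient UI flags are cleared on close; the Spark address data
// lives in the parent component and is untouched.
watch(showDeposit, (open) => {
  if (!open) {
    isCopied.value = false;
    isGenerated.value = false;
    activeTab.value = "main";
  }
});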
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Let's first find the Spark address generation logic
rg -l "spark.*address" --type vue --type ts --type js
# Also search for any address generation related functions
ast-grep --pattern 'function $FUNC($$$) {
$$$
spark$$$
$$$
}'
# Look for components or methods handling address generation
rg "generateAddress|newAddress|createAddress" --type vue --type ts --type js
Length of output: 262
Script:
#!/bin/bash
# Search for Spark address generation logic
rg "spark.*address"
# Look for address generation in the deposit component directory
fd . "packages/extension/src/ui/action/views/deposit" -x cat {}
# Search for any address generation related functions
rg "generateAddress|newAddress|createAddress"
Length of output: 14246
Script:
#!/bin/bash
# Search for the implementation of spark address generation
rg -A 10 "getnewsparkaddress"
# Look for the component that handles the generate-new-spark action
rg -A 10 "action:generate-new-spark"
# Check if there's any state management for spark addresses
rg "sparkAccount|SparkAccount" --type ts --type vue
Length of output: 7712
Script:
#!/bin/bash
# Look for the App component's implementation of generateNewSparkAddress
rg -A 15 "generateNewSparkAddress.*\{" packages/extension/src/ui/action/App.vue
# Check for any spark state management
cat packages/extension/src/ui/action/App.vue
Length of output: 21854
packages/extension/src/ui/action/App.vue (4)
64-64: New event @action:generate-new-spark in router-view.
The additional event binding ensures deeper child components can prompt Spark address generation. Validate that all child components respect the same usage patterns.
335-335: Add sparkAccount to accountHeaderData.
This is consistent with the new Spark functionality. Always ensure that references to accountHeaderData.sparkAccount are type-checked to avoid runtime errors if it’s null.
323-329: Conditional loading of Spark account for Firo.
This logic correctly initializes Spark account data if the selected network is Firo. Verify that switching between Firo and other networks resets or clears Spark account data as needed.
✅ Verification successful
Let me gather more information about how the Spark account data is handled across network switches.
Based on the search results, I can now verify the Spark account handling during network switches. The code in App.vue shows that:
- `sparkAccount` is initialized as `null` by default
- When setting a network (in the `setNetwork` function), `sparkAccount` is only populated if the network is Firo
- For non-Firo networks, `sparkAccount` remains `null` since the Firo-specific condition is skipped
- The `accountHeaderData` reactive object is completely reset with new values, including `sparkAccount`, during network changes
Spark account data is properly cleared during network switches.
The implementation correctly handles Spark account data by initializing it only for the Firo network and implicitly clearing it (setting it to null) when switching to other networks through the complete reset of accountHeaderData; a sketch follows.
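A sketch of that switching behaviour, under the assumption that `getSparkAccountState` aggregates the Spark data (simplified; the real `setNetwork` does more):

declare function getSparkAccountState(): Promise<unknown>;
declare const accountHeaderData: { value: Record<string, unknown> };

async function setNetworkSketch(networkName: string): Promise<void> {
  const sparkAccount =
    networkName === "Firo" ? await getSparkAccountState() : null;

  // The full reset of the header data implicitly clears Spark state
  // for every non-Firo network.
  accountHeaderData.value = {
    ...accountHeaderData.value,
    sparkAccount,
  };
}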
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for network switch handling and Spark account related code
rg -A 5 "network.*Firo|sparkAccount" packages/extension/src/ui/action/App.vue
Length of output: 1342
Script:
#!/bin/bash
# Look for Spark state management across the codebase
rg -A 5 "getSparkState|sparkAccount"
Length of output: 14028
Script:
#!/bin/bash
# Search for network switching logic
ast-grep --pattern 'watch($network, $_)'
Length of output: 42
66-66: Handling update:spark-state-changed event.
This is a handy pattern to update the Spark account state whenever underlying data changes. Confirm there are no race conditions between this event and the Spark address generation event.
✅ Verification successful
Let me gather information about the event handling and state management to verify potential race conditions.
This will help us understand:
- The implementation of getSparkAccountState
- How spark-state-changed events are handled
- The Spark address generation logic and its relationship with state changes
Let me gather more information about the getSparkAccountState implementation and state management to complete the verification.
This will help us understand:
- The implementation of getSparkAccountState and its dependencies
- How spark state is managed and updated
- The relationship between state updates and address generation
Based on the gathered information, I can now provide the final response:
No race conditions detected in Spark state management
The implementation shows a well-structured state management approach where:
- `getSparkAccountState` is only triggered by the `update:spark-state-changed` event
- Address generation (`generateSparkAddress`) and state updates (`getSparkState`) are separate, atomic operations
- State updates use the spread operator to create new state objects, ensuring clean updates
- The Spark state is fetched with a concurrent Promise.all for addresses and balance; a sketch follows this list
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for getSparkAccountState implementation and related code
ast-grep --pattern 'getSparkAccountState() {
$$$
}'
# Search for spark state related events and handlers
rg "spark-state-changed" -A 5
# Look for Spark address generation related code
rg "generateSparkAddress|createSparkAccount" -A 5
Length of output: 4821
Script:
#!/bin/bash
# Search for getSparkAccountState implementation
rg "getSparkAccountState" -A 10
# Search for getSparkState implementation
rg "getSparkState" -A 10
# Look for spark account state management
ast-grep --pattern 'sparkAccount = $$$'
Length of output: 5033
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (4)
58-63: Select token for sending.
The code structure for send-token-select is consistent. No issues flagged here.
125-141: Send transaction buttons.
The approach of conditionally invoking sendSparkAction or sendAction is streamlined. Double-check that isAddress vs. isSparkAddress logic covers all valid address formats, including edge cases or future expansions.
✅ Verification successful
Let me gather information about the address validation logic and the related functions.
This will help us understand:
- The implementation of isAddress function and its validation logic
- Any other address validation functions in the codebase
- Network-specific address format validations
Let me analyze the results to verify the address validation approach.
Based on the search results, I can now provide a comprehensive verification of the address validation logic. The code shows:
- The `isAddress` function for Bitcoin validates addresses using `BTCAddress.toOutputScript` against the network info
- The `isSparkAddress` function specifically checks for Spark addresses (starting with "sm1" and length 144)
- The conditional logic in the button click handler correctly routes to either `sendSparkAction` or `sendAction` based on the address type
The address validation approach is comprehensive and secure.
- Bitcoin addresses are validated against the network's parameters using the standard Bitcoin address validation
- Spark addresses have specific format validation (prefix and length checks)
- The conditional routing ensures each address type is handled by its appropriate action handler; a minimal sketch follows
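A minimal sketch of that routing (signatures are assumptions based on the review's references):

declare function isSparkAddress(address: string): boolean;
declare function isAddress(address: string, networkInfo: unknown): boolean;
declare function sendSparkAction(): Promise<void>;
declare function sendAction(): Promise<void>;

async function onSendClick(addressTo: string, networkInfo: unknown) {
  if (isSparkAddress(addressTo)) {
    await sendSparkAction(); // shielded Spark path
  } else if (isAddress(addressTo, networkInfo)) {
    await sendAction(); // transparent path
  }
  // If neither validator matches, the send button stays disabled upstream.
}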
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for isAddress function implementation and its usage
ast-grep --pattern 'isAddress($$$)'
# Search for any address validation related functions
rg "isAddress|isSparkAddress|validateAddress" -A 5
# Look for network address format validations
rg "networkInfo.*address" -A 5
Length of output: 50685
1-24: Sender address input structure.
Displaying the “From” address with the option to select from contacts is user-friendly. Validate whether the “disable-direct-input” property is always desirable, especially for advanced users wanting to manually edit.
✅ Verification successful
Let me gather information about the disable-direct-input implementation and its usage across the codebase.
Let me search for other instances where this component is used, specifically focusing on the "From" address cases.
Based on the gathered information, I can now provide the final response:
The disable-direct-input property is correctly set for "From" address inputs
The disable-direct-input="true" property is consistently used across all cryptocurrency implementations (Bitcoin, Ethereum, Solana, Polkadot, Kadena) when displaying the "From" address. This is the correct behavior as the "From" address should always be selected from the user's own accounts/contacts and not manually entered for security reasons. Manual address entry is only appropriate for recipient ("To") addresses.
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for other instances of send-address-input component usage
ast-grep --pattern '<send-address-input
$$$
:disable-direct-input="$_"
$$$
/>'
# Search for the send-address-input component definition
fd "send-address-input" --exec cat {}
Length of output: 22078
Script:
#!/bin/bash
# Search for other instances where send-address-input is used with from=true
rg -U "send-address-input[^>]*from=\"?true\"?" -A 5
Length of output: 4495
72-81: NFT select list.
The list approach for selecting the desired NFT is intuitive. Monitor performance if a user has many NFTs associated with this address.
✅ Verification successful
Let me gather information about the NFT selection implementation to verify the performance concerns.
Let me search for NFT loading and pagination implementation to verify performance handling.
Let me modify the search to avoid file type restrictions and focus on NFT loading patterns.
Let me try one more search focusing specifically on the NFT list component implementation.
Based on the code analysis, I can now provide a verified response regarding the NFT selection implementation:
NFT list implementation has built-in performance optimizations
The NFT selection list uses several performance-friendly patterns:
- Uses virtual scrolling through the `custom-scrollbar` component with the `suppressScrollX` setting
- Implements lazy loading of NFT data through the network's NFTHandler
- Efficiently manages state through Vue's reactive system with computed properties
- Includes search functionality to help users quickly find NFTs in large collections
🏁 Scripts executed
The following scripts were executed for the analysis:
Error: No shell scripts found
Script:
#!/bin/bash
# Search for NFT select list component implementation
ast-grep --pattern 'const nftSelectList = {
$$$
}'
# Search for any NFT-related data fetching or loading logic
rg "nft.*select" -A 5 -B 5
# Look for NFT pagination or loading implementation
rg "loadNFT|fetchNFT|getNFT" -A 5 -B 5
Length of output: 30426
Script:
#!/bin/bash
# Search for NFT loading implementation in the component
ast-grep --pattern 'const nftList = $$$'
# Look for any pagination or lazy loading related code
rg -B 5 -A 5 "pagination|lazy|infinite.*scroll|load.*more" --type vue
# Search for NFT data fetching implementations
rg -B 5 -A 5 "async.*getNFTs?|fetchNFTs?" --type vue --type ts
Length of output: 235
Script:
#!/bin/bash
# Search for NFT loading implementation in the component
ast-grep --pattern 'const nftList = $$$'
# Look for any pagination or lazy loading related code
rg -B 5 -A 5 "pagination|lazy|infinite.*scroll|load.*more"
# Search for NFT data fetching implementations
rg -B 5 -A 5 "async.*getNFTs?|fetchNFTs?"
# Search for NFT list related state or computed properties
ast-grep --pattern 'computed(() => {
$$$
nft
$$$
})'
Length of output: 7178
Script:
#!/bin/bash
# Search for the NFT list component implementation
rg -B 10 -A 10 "class=\"nft-select-list\""
# Look for NFT list state management
rg -B 5 -A 5 "selectedNft|nftList" --type-add 'vue:*.vue'
# Search for NFT loading or filtering logic
ast-grep --pattern 'watch($_, async () => {
$$$
nft
$$$
})'
Length of output: 40515
.github/workflows/release.yml (1)
47-47: Rename of Chrome/Edge release artifact.
Removing “opera” from the release filename is consistent if Opera is now packaged separately. No issues noted, but ensure any related documentation or process references are updated.
packages/extension/src/providers/bitcoin/libs/utils.ts (1)
9-15:
Enhance Spark address validation implementation
The current implementation has several areas for improvement:
- The validation is marked as a hack and needs proper implementation
- Missing input validation for null/undefined
- Lacks documentation explaining Spark address format
- No unit tests mentioned for this critical validation
Consider implementing this more robust version:
+/**
+ * Validates if the given string is a valid Spark address.
+ * Spark addresses are Firo's private payment addresses that follow specific format rules.
+ * @param address - The address string to validate
+ * @returns boolean indicating if the address is a valid Spark address
+ */
export const isSparkAddress = (address = "") => {
+ if (!address) return false;
if (address.length === 144 && address.startsWith("sm1")) {
- // TODO: hack, change in future
return true;
}
return false;
};

Let's verify if there are any existing tests:
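Pending that verification, a hedged test sketch (Jest/Vitest style; the import path is an assumption and the sample strings are synthetic, not real Spark addresses):

import { isSparkAddress } from "../libs/utils"; // assumed path

describe("isSparkAddress", () => {
  it("accepts a 144-character string starting with sm1", () => {
    expect(isSparkAddress("sm1" + "a".repeat(141))).toBe(true);
  });

  it("rejects wrong prefixes and wrong lengths", () => {
    expect(isSparkAddress("sm1short")).toBe(false);
    expect(isSparkAddress("xx1" + "a".repeat(141))).toBe(false);
  });

  it("rejects empty input", () => {
    expect(isSparkAddress("")).toBe(false);
  });
});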
packages/extension/src/providers/bitcoin/types/bitcoin-network.ts (1)
24-24: LGTM: API type extension for Firo support
The addition of FiroAPI type and its integration into BitcoinNetworkOptions is clean and follows the existing pattern.
Also applies to: 56-56
Actionable comments posted: 7
🧹 Nitpick comments (4)
packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (1)
15-30: Add input validation and better error feedback

The address input field should provide more explicit feedback when an invalid address is entered. Currently, it only changes the text color to red.
Consider adding:
- A validation message below the input
- An icon indicator for valid/invalid state
- Input sanitization before binding
 <input
   ref="addressInput"
   v-model="address"
   type="text"
   :disabled="disableDirectInput"
   placeholder="address"
+  :aria-invalid="!isSparkAddress(btcAddress) && !isAddress(btcAddress, props.network.networkInfo)"
   :style="{
     color:
       !isSparkAddress(btcAddress) &&
       !isAddress(btcAddress, props.network.networkInfo)
         ? 'red'
         : 'black',
   }"
   @focus="changeFocus"
   @blur="changeFocus"
 />
+<span
+  v-if="btcAddress && !isSparkAddress(btcAddress) && !isAddress(btcAddress, props.network.networkInfo)"
+  class="error-message"
+>Invalid address format</span>
175-185: Replace hard-coded timeouts with configurable values

The component uses hard-coded timeout values for UI state changes, which could lead to a poor user experience on slower connections.
Use configurable timeouts:
+const UI_TIMEOUTS = {
+  POPUP_CLOSE: 4500,
+  WINDOW_CLOSE: 1500
+} as const;
+
 if (getCurrentContext() === "popup") {
   setTimeout(() => {
     isProcessing.value = false;
     router.go(-2);
-  }, 4500);
+  }, UI_TIMEOUTS.POPUP_CLOSE);
 } else {
   setTimeout(() => {
     isProcessing.value = false;
     window.close();
-  }, 1500);
+  }, UI_TIMEOUTS.WINDOW_CLOSE);
 }
35-35: Consider enhancing error message display

The error message is directly rendered in the template without a proper formatting or error-handling component. Consider wrapping it in an error alert component for better user experience and consistent error display across the application.
-{{ errorMsg }}
+<error-alert v-if="errorMsg" :message="errorMsg" />
175-185: Extract timeout values to configuration

The timeout values are hardcoded. Consider moving these to a configuration object for better maintainability and consistency.
+const TIMEOUT_CONFIG = {
+  POPUP_CLOSE: 4500,
+  WINDOW_CLOSE: 1500
+};
+
 if (getCurrentContext() === "popup") {
   setTimeout(() => {
     isProcessing.value = false;
     router.go(-2);
-  }, 4500);
+  }, TIMEOUT_CONFIG.POPUP_CLOSE);
 } else {
   setTimeout(() => {
     isProcessing.value = false;
     window.close();
-  }, 1500);
+  }, TIMEOUT_CONFIG.WINDOW_CLOSE);
 }
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `yarn.lock` is excluded by `!**/yarn.lock`, `!**/*.lock`
📒 Files selected for processing (7)
- packages/extension/src/providers/bitcoin/networks/firo-testnet.ts (1 hunks)
- packages/extension/src/providers/bitcoin/networks/firo.ts (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (1 hunks)
- packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (1 hunks)
- packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/extension/src/providers/bitcoin/networks/firo.ts
- packages/extension/src/providers/bitcoin/networks/firo-testnet.ts
🔇 Additional comments (1)
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (1)
444-476: Verify Spark address transaction security
The sendSparkAction function needs additional security checks before initiating the transaction (a minimal sketch follows the list):
- Validate the address format
- Check for sufficient balance including fees
- Implement rate limiting to prevent spam
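A minimal sketch of those guards, assuming BigNumber-based string amounts (not the actual handler):

import BigNumber from "bignumber.js";

declare function isSparkAddress(address: string): boolean;

let lastSubmitAt = 0;

function canSendSpark(
  to: string,
  amount: string,
  availableBalance: string,
  estimatedFee: string
): boolean {
  // 1. Address format
  if (!isSparkAddress(to)) return false;
  // 2. Balance must cover amount plus fee
  if (new BigNumber(amount).plus(estimatedFee).gt(availableBalance)) return false;
  // 3. Crude rate limit: at most one submission per second
  const now = Date.now();
  if (now - lastSubmitAt < 1000) return false;
  lastSubmitAt = now;
  return true;
}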
Force-pushed from 403bc8a to aa670f8.
Actionable comments posted: 7
♻️ Duplicate comments (1)
packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (1)
188-205: ⚠️ Potential issue — Improve error handling and message sanitization
The error handling exposes raw error messages to the UI, which could leak sensitive information.
🧹 Nitpick comments (15)
packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (3)
176-185: Extract magic numbers into named constants

Hardcoded timeout values should be extracted into named constants for better maintainability.
+const POPUP_CLOSE_DELAY = 4500;
+const WINDOW_CLOSE_DELAY = 1500;
+
 if (getCurrentContext() === "popup") {
   setTimeout(() => {
     isProcessing.value = false;
     router.go(-2);
-  }, 4500);
+  }, POPUP_CLOSE_DELAY);
 } else {
   setTimeout(() => {
     isProcessing.value = false;
     window.close();
-  }, 1500);
+  }, WINDOW_CLOSE_DELAY);
 }
132-149: Enhance transaction activity initialization

The transaction activity initialization could be more robust with validation and default values.
+const createTransactionActivity = (
+  params: VerifyTransactionParams,
+  network: BitcoinNetwork
+): Activity => ({
   from: network.value.displayAddress(txData.fromAddress),
   to: txData.toAddress,
   isIncoming: txData.fromAddress === txData.toAddress,
   network: network.value.name,
   status: ActivityStatus.pending,
   timestamp: new Date().getTime(),
   token: {
     decimals: txData.toToken.decimals,
     icon: txData.toToken.icon,
     name: txData.toToken.name,
     symbol: txData.toToken.symbol,
     price: txData.toToken.price,
   },
   type: ActivityType.transaction,
   value: txData.toToken.amount,
   transactionHash: "",
+});
+
+const txActivity = createTransactionActivity(txData, network.value);
332-333: Extract magic numbers in CSS calculations

The width and height calculations use magic numbers that should be extracted into LESS variables for better maintainability.
+@scroll-offset: 53px;
+@button-height: 88px;
+
-width: calc(~"100% + 53px");
-height: calc(~"100% - 88px");
+width: calc(~"100% + @{scroll-offset}");
+height: calc(~"100% - @{button-height}");

packages/extension/src/libs/spark-handler/callRPC.ts (3)
5-8: Improve type safety with proper interfaces

The current implementation uses `any` as the default type parameter and lacks proper interfaces for RPC requests and responses. Consider adding these type definitions:
interface RPCRequest {
  jsonrpc: string;
  id: string;
  method: string;
  params: unknown[];
}

interface RPCResponse<T> {
  jsonrpc: string;
  id: string;
  result?: T;
  error?: {
    code: number;
    message: string;
  };
}

// Then update the function signature:
export async function callRPC<T>(
  method: keyof typeof RPC_METHODS, // Define allowed methods
  params?: object
): Promise<T> {
29-32: Improve error handling with specific error types

The current error handling is too generic and might log sensitive information.
Consider implementing specific error types:
class RPCError extends Error {
  constructor(
    message: string,
    public code?: number,
    public method?: string
  ) {
    super(message);
    this.name = 'RPCError';
  }
}

// Then update the error handling:
} catch (error) {
  if (axios.isAxiosError(error)) {
    throw new RPCError(
      `Network error during RPC call to ${method}`,
      error.response?.status
    );
  }
  if (error instanceof Error) {
    throw new RPCError(error.message);
  }
  throw new RPCError('Unknown error during RPC call');
}
1-33: Add security measures for RPC communication

The implementation makes external HTTP requests without rate limiting or explicit SSL/TLS verification settings.
Consider these security improvements:
- Implement rate limiting to prevent DoS
- Explicitly enable SSL/TLS verification in axios
- Add retry mechanism with exponential backoff for failed requests
- Consider using a circuit breaker pattern for RPC calls
Example implementation:
import rateLimit from 'axios-rate-limit';
import axiosRetry from 'axios-retry';

const axiosInstance = rateLimit(
  axios.create({
    timeout: DEFAULT_TIMEOUT,
    headers: { "Content-Type": "application/json" },
    httpsAgent: new https.Agent({ rejectUnauthorized: true })
  }),
  { maxRequests: 100, perMilliseconds: 1000 }
);

axiosRetry(axiosInstance, {
  retries: 3,
  retryDelay: axiosRetry.exponentialDelay,
  retryCondition: (error) => {
    return axiosRetry.isNetworkOrIdempotentRequestError(error) ||
      error.code === 'ECONNABORTED';
  }
});

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (2)
25-56: Enhance accessibility for address input sections

The form inputs lack proper ARIA attributes for better screen-reader support. Consider adding appropriate ARIA labels and descriptions.
 <div class="form__container-send-input">
   <send-spark-address-input
     v-if="network.name === 'Firo'"
     ref="addressInputTo"
     class="no-margin"
     title="To address"
     :value="addressTo"
     :network="(network as BitcoinNetwork)"
+    aria-label="Recipient Spark address input"
+    aria-required="true"
     @update:input-address="inputAddressTo"
     @toggle:show-contacts="toggleSelectContactTo"
   />
   <send-address-input
     v-if="network.name !== 'Firo'"
     ref="addressInputTo"
     class="no-margin"
     title="To address"
     :value="addressTo"
     :network="(network as BitcoinNetwork)"
+    aria-label="Recipient address input"
+    aria-required="true"
     @update:input-address="inputAddressTo"
     @toggle:show-contacts="toggleSelectContactTo"
   />
182-184: Consider moving browser polyfill to a utility module

The Browser polyfill import could be moved to a separate utility module to improve code organization and reusability.
Create a new utility file
`browser-utils.ts`:

import Browser from "webextension-polyfill";

export const createPopupWindow = async (url: string, options = {}) => {
  return Browser.windows.create({
    url: Browser.runtime.getURL(url),
    type: "popup",
    focused: true,
    height: 600,
    width: 460,
    ...options,
  });
};

packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (2)
43-53: Enhance clipboard error handling and user feedback.

The clipboard operation should provide user feedback and handle permissions.
 const pasteFromClipboard = async () => {
   try {
     const text = await navigator.clipboard.readText();
     if (addressInput.value) {
       addressInput.value?.focus()
       emit("update:inputAddress", text);
     }
   } catch (err) {
-    console.error("Failed to read clipboard:", err);
+    const errorMessage = err instanceof Error ? err.message : 'Unknown error';
+    console.error("Failed to read clipboard:", errorMessage);
+    // Emit error event for UI feedback
+    emit("clipboard:error", errorMessage.includes('denied') ?
+      'Please allow clipboard access' :
+      'Failed to read from clipboard');
   }
 };
142-154: Enhance input accessibility.

The input field could benefit from ARIA attributes and focus styles for better accessibility.
 input {
   width: 290px;
   height: 24px;
   font-style: normal;
   font-weight: 400;
   font-size: 16px;
   line-height: 24px;
   letter-spacing: 0.25px;
   color: @primaryLabel;
   border: 0 none;
   outline: none;
   padding: 0;
+  &:focus-visible {
+    outline: 2px solid @primary;
+    outline-offset: 2px;
+  }
 }

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (1)
141-166: Simplify transaction validation logic.

The validation logic combines multiple concerns and could be split into separate validation functions for better maintainability.
+const isValidAddress = computed(() =>
+  isSparkAddress(addressTo.value) ||
+  isAddress(addressTo.value, props.network.networkInfo)
+);
+
+const isValidAmount = computed(() =>
+  props.isSendToken &&
+  isValidDecimals(sendAmount.value, props.network.decimals!)
+);
+
+const isAboveDust = computed(() =>
+  !(Number(sendAmount.value) < (props.network as BitcoinNetwork).dust &&
+    props.isSendToken)
+);
+
+const hasEnoughBalance = computed(() =>
+  !new BigNumber(sendAmount.value).gt(
+    props.sparkAccount.sparkBalance.availableBalance
+  )
+);
+
 const isInputsValid = computed<boolean>(() => {
-  if (
-    !isSparkAddress(addressTo.value) &&
-    !isAddress(addressTo.value, props.network.networkInfo)
-  )
-    return false;
-
-  if (
-    props.isSendToken &&
-    !isValidDecimals(sendAmount.value, props.network.decimals!)
-  ) {
-    return false;
-  }
-  if (
-    Number(sendAmount.value) < (props.network as BitcoinNetwork).dust &&
-    props.isSendToken
-  )
-    return false;
-  if (
-    new BigNumber(sendAmount.value).gt(
-      props.sparkAccount.sparkBalance.availableBalance
-    )
-  )
-    return false;
-  return true;
+  return isValidAddress.value &&
+    isValidAmount.value &&
+    isAboveDust.value &&
+    hasEnoughBalance.value;
 });

packages/extension/src/providers/bitcoin/libs/api-firo.ts (4)
14-14: Remove or implement the unutilized init function.
The async init() method is declared but not implemented. If no initialization logic is required, consider removing the method to avoid confusion. Otherwise, implement the intended setup routines.
34-42: Leverage async/await for cleaner error handling.
Chained `.then()` calls are more cumbersome to maintain; adopting async/await improves readability and makes try/catch error handling more straightforward.

- return fetch(`${this.node}/insight-api-zcoin/rawtx/${hash}`)
-   .then((res) => res.json())
-   .then((tx: { hex: string; error: unknown }) => {
-     if ((tx as any).error) return null;
-     if (!tx.hex) return null;
-     return `0x${tx.hex}`;
-   });
+ try {
+   const res = await fetch(`${this.node}/insight-api-zcoin/rawtx/${hash}`);
+   const tx: { hex: string; error?: unknown } = await res.json();
+   if (tx.error || !tx.hex) return null;
+   return `0x${tx.hex}`;
+ } catch {
+   return null;
+ }
53-60: Adopt optional chaining to reduce logical checks.
Instead of explicitly verifying that `t.addresses` or `t.scriptPubKey.addresses` exist, you can use optional chaining for cleaner code.

- .filter((t) => t.addresses && t.addresses.length)
+ .filter((t) => t.addresses?.length)
- .filter(
-   (t) => t.scriptPubKey.addresses && t.scriptPubKey.addresses.length
- )
+ .filter(
+   (t) => t.scriptPubKey.addresses?.length
+ )
🪛 Biome (1.9.4)
[error] 53-53: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
[error] 60-60: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
74-87: Unify error handling for safer parsing.
The code checks balance.message in an any-casted object. Consider defining a refined type or adding a type guard to differentiate successful results from error objects, removing the need for casting.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `yarn.lock` is excluded by `!**/yarn.lock`, `!**/*.lock`
📒 Files selected for processing (9)
- packages/extension/src/libs/spark-handler/callRPC.ts (1 hunks)
- packages/extension/src/providers/bitcoin/libs/api-firo.ts (1 hunks)
- packages/extension/src/providers/bitcoin/networks/firo-testnet.ts (1 hunks)
- packages/extension/src/providers/bitcoin/networks/firo.ts (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (1 hunks)
- packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (1 hunks)
- packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/extension/src/providers/bitcoin/networks/firo-testnet.ts
- packages/extension/src/providers/bitcoin/networks/firo.ts
🧰 Additional context used
🪛 Biome (1.9.4)
packages/extension/src/providers/bitcoin/libs/api-firo.ts
[error] 53-53: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
[error] 60-60: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
🔇 Additional comments (9)
packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (1)
114-117: Verify network initialization and error handling
The network initialization is critical for Spark transactions. Ensure proper error handling when network is not found.
✅ Verification successful
Based on the search results, I can now generate the final response since I have enough information about the network initialization and error handling:
Network initialization is properly handled
The code follows the established pattern used throughout the codebase for network initialization:
- Non-null assertion is safe here as this component is only rendered after network selection
- Error cases are handled in the parent App.vue which provides fallback to default network
- The type assertion to BitcoinNetwork is valid as this component specifically handles Bitcoin/Spark transactions
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify network initialization and related components
# Check for network-related error handling
rg -A 5 "getNetworkByName"
# Check for network validation in Spark-related components
rg -A 10 "BitcoinNetwork.*Spark"
# Check for proper network type validation
ast-grep --pattern 'network.value = $_ as BitcoinNetwork'
Length of output: 47175
packages/extension/src/libs/spark-handler/callRPC.ts (1)
25-28: Response validation looks good!
The implementation correctly validates the RPC response structure before accessing the result.
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (3)
261-270: Previous review comment about fee calculation safety checks is still valid
The fee calculation still lacks proper error handling and safety checks.
577-662: LGTM! Styles are well-organized
The styles follow good practices and are properly scoped.
311-334:
Add comprehensive input validation
The current validation lacks checks for:
- Maximum transaction amount
- Network-specific address format validation
- Sufficient balance including fees
const isInputsValid = computed<boolean>(() => {
+ // Check maximum transaction limits
+ const maxTxLimit = new BigNumber(1000000); // Network specific limit
+ if (new BigNumber(sendAmount.value).gt(maxTxLimit)) {
+ return false;
+ }
+
+ // Validate sufficient balance including fees
+ const totalRequired = new BigNumber(sendAmount.value).plus(
+ gasCostValues.value[selectedFee.value].nativeValue
+ );
+ if (totalRequired.gt(UTXOBalance.value.toString())) {
+ return false;
+ }
+
// Rest of the validation...
return true;
});

Likely invalid or redundant comment.
packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (1)
1-32: LGTM! Clean template structure with proper validation feedback.
The template provides clear visual feedback through avatar display and color changes based on address validity.
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (1)
198-228:
Add rate limiting and error handling to send action.
The send action should be protected against multiple rapid clicks and include proper error handling.
+import { debounce } from 'lodash';
+const isSubmitting = ref(false);
-const sendAction = async () => {
+const sendAction = debounce(async () => {
+ if (isSubmitting.value) return;
+ isSubmitting.value = true;
const fromAccountAddress = props.sparkAccount.allAddresses.at(-1)!;
const toAmount = toBN(toBase(sendAmount.value, props.network.decimals));
+ try {
router.push({
name: RouterNames.verifySendFromSpark.name,
query: {
id: selected,
txData: Buffer.from(
JSON.stringify({
toToken: {
amount: toAmount.toString(),
decimals: availableAsset.value.decimals!,
icon: availableAsset.value.icon as string,
symbol: availableAsset.value.symbol,
valueUSD: new BigNumber(availableAsset.value.price || "0")
.times(sendAmount.value)
.toString(),
name: availableAsset.value.name,
price: availableAsset.value.price || "0",
},
fromAddress: fromAccountAddress,
fromAddressName: "Spark Account",
toAddress: addressTo.value,
}),
"utf8"
).toString("base64"),
},
});
+ } catch (error) {
+ console.error('Failed to initiate transaction:', error);
+ // Emit error for UI feedback
+ emit('send:error', 'Failed to initiate transaction');
+ } finally {
+ isSubmitting.value = false;
+ }
-};
+}, 1000, { leading: true, trailing: false });

Likely invalid or redundant comment.
packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue (1)
191-195:
Improve error message handling for better security.
Direct stringification of error responses could expose sensitive information. Implement proper error mapping.
+const mapErrorToUserMessage = (error: unknown): string => {
+ if (isAxiosError(error)) {
+ const message = error.response?.data.error.message;
+ // Map known error messages to user-friendly messages
+ if (message?.includes('insufficient funds')) {
+ return 'Insufficient funds to complete the transaction';
+ }
+ if (message?.includes('nonce')) {
+ return 'Transaction rejected: Please try again';
+ }
+ return 'Failed to send transaction. Please try again later';
+ }
+ return 'An unexpected error occurred';
+};
-if (isAxiosError(error)) {
- errorMsg.value = JSON.stringify(error.response?.data.error.message);
-} else {
- errorMsg.value = JSON.stringify(error);
-}
+errorMsg.value = mapErrorToUserMessage(error);

Likely invalid or redundant comment.
packages/extension/src/providers/bitcoin/libs/api-firo.ts (1)
139-156: Returning only the last UTXO is a critical limitation.
This code currently only passes the final element in the UTXO array to filterOutOrdinals. If the intent is to handle multiple UTXOs, you should return the entire array.
- [(await this.FiroToHaskoinUTXOs(utxos, address)).at(-1)!]
+ await this.FiroToHaskoinUTXOs(utxos, address)
Force-pushed from c47ef46 to f9d1993.
Force-pushed from f9d1993 to 643b114.
Actionable comments posted: 7
♻️ Duplicate comments (4)
packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (2)
33-33: ⚠️ Potential issue — Sanitize error messages before display
Direct interpolation of error messages in the template could expose sensitive information or be vulnerable to XSS attacks.
Replace the direct interpolation with a sanitized message display:
-{{ errorMsg }}
+<span class="error-message">{{ sanitizedErrorMessage }}</span>

Add the following computed property to the script section:
const sanitizedErrorMessage = computed(() => {
  if (!errorMsg.value) return '';
  // Basic sanitization and truncation
  return errorMsg.value
    .toString()
    .replace(/[^\w\s-]/g, '')
    .substring(0, 100);
});
188-205: ⚠️ Potential issue — Improve error handling in catch block
The current error handling directly stringifies the error response or error object, which might expose sensitive information or result in unclear messages to users.
Implement proper error mapping:
 .catch((error) => {
   isProcessing.value = false;
-  if (isAxiosError(error)) {
-    errorMsg.value = JSON.stringify(error.response?.data.error.message);
-  } else {
-    errorMsg.value = JSON.stringify(error);
-  }
+  const getErrorMessage = (error: unknown): string => {
+    if (isAxiosError(error)) {
+      const message = error.response?.data.error.message;
+      return typeof message === 'string'
+        ? message.replace(/[^\w\s-]/g, '') // Basic sanitization
+        : 'Network error occurred';
+    }
+    return 'Transaction failed';
+  };
+  errorMsg.value = getErrorMessage(error);
   trackSendEvents(SendEventType.SendFailed, {
     network: network.value.name,
     error: error.message,
   });

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (1)
198-228: ⚠️ Potential issue — Add rate limiting and loading state to prevent duplicate transactions.
The
`sendAction` method lacks protection against rapid multiple clicks, which could lead to duplicate transactions. Apply this diff to add debouncing and loading state:
+import { debounce } from 'lodash';
+const isSubmitting = ref(false);
-const sendAction = async () => {
+const sendAction = debounce(async () => {
+  if (isSubmitting.value) return;
+  isSubmitting.value = true;
   const fromAccountAddress = props.sparkAccount.allAddresses.at(-1)!;
   const toAmount = toBN(toBase(sendAmount.value, props.network.decimals));
+  try {
     router.push({
       name: RouterNames.verifySendFromSpark.name,
       query: {
         id: selected,
         txData: Buffer.from(
           JSON.stringify({
             toToken: {
               amount: toAmount.toString(),
               decimals: availableAsset.value.decimals!,
               icon: availableAsset.value.icon as string,
               symbol: availableAsset.value.symbol,
               valueUSD: new BigNumber(availableAsset.value.price || "0")
                 .times(sendAmount.value)
                 .toString(),
               name: availableAsset.value.name,
               price: availableAsset.value.price || "0",
             },
             fromAddress: fromAccountAddress,
             fromAddressName: "Spark Account",
             toAddress: addressTo.value,
           }),
           "utf8"
         ).toString("base64"),
       },
     });
+  } catch (error) {
+    console.error('Failed to initiate transaction:', error);
+    // Emit error for UI feedback
+    emit('send:error', 'Failed to initiate transaction');
+  } finally {
+    isSubmitting.value = false;
+  }
-};
+}, 1000, { leading: true, trailing: false });

packages/extension/src/providers/bitcoin/libs/api-firo.ts (1)
107-137: ⚠️ Potential issue — Add error handling for API calls in FiroToHaskoinUTXOs method.
The method makes multiple API calls without proper error handling.
 async FiroToHaskoinUTXOs(
   FiroUTXOs: FiroUnspentType[],
   address: string
 ): Promise<HaskoinUnspentType[]> {
   const ret: HaskoinUnspentType[] = [];
+  try {
     for (const utx of FiroUTXOs) {
+      try {
         const rawTxRes = (await cacheFetch({
           url: `${this.node}/insight-api-zcoin/rawtx/${utx.txid}`,
         })) as { rawtx: string };
         const res = (await cacheFetch({
           url: `${this.node}/insight-api-zcoin/tx/${utx.txid}`,
         })) as FiroTxType;

         ret.push({
           address,
           block: {
             height: res.blockheight,
             position: 0,
           },
           index: utx.vout,
           pkscript: res.vout[utx.vout].scriptPubKey.hex,
           txid: utx.txid,
           value: Number(utx.satoshis),
           raw: rawTxRes.rawtx,
         });
+      } catch (error) {
+        console.error(`Failed to process UTXO ${utx.txid}:`, error);
+        // Skip this UTXO and continue with others
+        continue;
+      }
     }
     ret.sort((a, b) => {
       return a.value - b.value;
     });
     return ret;
+  } catch (error) {
+    console.error('Failed to process UTXOs:', error);
+    throw error;
+  }
 }
🧹 Nitpick comments (6)
packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (1)
176-185: Add cleanup for timeouts

Timeouts should be cleaned up when the component is unmounted to prevent memory leaks.
Add timeout cleanup:
+const timeoutRef = ref<number>();
+
 if (getCurrentContext() === "popup") {
-  setTimeout(() => {
+  timeoutRef.value = window.setTimeout(() => {
     isProcessing.value = false;
     router.go(-2);
   }, 4500);
 } else {
-  setTimeout(() => {
+  timeoutRef.value = window.setTimeout(() => {
     isProcessing.value = false;
     window.close();
   }, 1500);
 }
+
+onBeforeUnmount(() => {
+  if (timeoutRef.value) {
+    window.clearTimeout(timeoutRef.value);
+  }
+});
40-43: Consider adding specific error handlingThe current error handling is generic. Consider adding specific handling for common RPC errors.
Example implementation:
} catch (error) { + if (axios.isAxiosError(error)) { + if (error.response?.status === 401) { + throw new Error('Unauthorized: Check RPC credentials'); + } + if (error.code === 'ECONNABORTED') { + throw new Error('RPC request timed out'); + } + } console.error("RPC Error:", error); throw error; }packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (2)
4-11: Add alt text for accessibilityThe avatar image lacks meaningful alt text for screen readers.
<img v-if=" isSparkAddress(btcAddress) || isAddress(btcAddress, props.network.networkInfo) " :src="network.identicon(btcAddress)" - alt="" + :alt="`${btcAddress} identicon`" />
15-30: Consider adding input validation feedbackThe input validation only shows red text color for invalid addresses. Consider adding a tooltip or helper text to explain why the address is invalid.
<input ref="addressInput" v-model="address" type="text" :disabled="disableDirectInput" placeholder="address" + :aria-invalid="!isSparkAddress(btcAddress) && !isAddress(btcAddress, props.network.networkInfo)" :style="{ color: !isSparkAddress(btcAddress) && !isAddress(btcAddress, props.network.networkInfo) ? 'red' : 'black', }" @focus="changeFocus" @blur="changeFocus" /> + <span v-if="btcAddress && !isSparkAddress(btcAddress) && !isAddress(btcAddress, props.network.networkInfo)" + class="invalid-feedback"> + Invalid address format + </span>packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue (1)
87-89: Consider adding analytics for tab switchesSince the PR adds new UI interactions, consider tracking tab switches for analytics.
const setSelectedSendTab = (activeTab: "transparent" | "spark") => { selectedSendTab.value = activeTab; + trackSendEvents(SendEventType.SendTabSwitch, { + network: props.network.name, + tab: activeTab + }); };packages/extension/src/providers/bitcoin/libs/api-firo.ts (1)
31-32: Implement or document the empty init() method.The
init()method is currently empty. If it's intended to be implemented by consumers, consider adding a comment explaining its purpose or marking it as optional.- async init(): Promise<void> {} + /** + * Optional initialization method for API setup. + * Override this method if additional setup is needed. + */ + async init(): Promise<void> {}
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `yarn.lock` is excluded by `!**/yarn.lock`, `!**/*.lock`
📒 Files selected for processing (14)
- packages/extension/src/libs/spark-handler/callRPC.ts (1 hunks)
- packages/extension/src/libs/spark-handler/index.ts (1 hunks)
- packages/extension/src/providers/bitcoin/libs/api-firo.ts (1 hunks)
- packages/extension/src/providers/bitcoin/networks/firo-testnet.ts (1 hunks)
- packages/extension/src/providers/bitcoin/networks/firo.ts (1 hunks)
- packages/extension/src/providers/bitcoin/types/bitcoin-network.ts (2 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue (4 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (1 hunks)
- packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (1 hunks)
- packages/extension/src/ui/action/App.vue (7 hunks)
- packages/extension/src/ui/action/router/index.ts (2 hunks)
- packages/extension/src/ui/action/views/verify-send-from-spark-transaction/index.vue (1 hunks)
- packages/extension/src/ui/action/views/verify-send-to-spark-transaction/index.vue (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (5)
- packages/extension/src/providers/bitcoin/networks/firo-testnet.ts
- packages/extension/src/libs/spark-handler/index.ts
- packages/extension/src/providers/bitcoin/types/bitcoin-network.ts
- packages/extension/src/ui/action/router/index.ts
- packages/extension/src/providers/bitcoin/networks/firo.ts
🧰 Additional context used
🪛 Biome (1.9.4)
packages/extension/src/providers/bitcoin/libs/api-firo.ts
[error] 53-53: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
[error] 60-60: Change to an optional chain.
Unsafe fix: Change to an optional chain.
(lint/complexity/useOptionalChain)
🔇 Additional comments (15)
packages/extension/src/libs/spark-handler/callRPC.ts (3)
36-39: LGTM: Response validation looks good
The implementation properly validates the RPC response structure before accessing the result.
5-7:
Add testnet URL configuration
The PR objectives mention support for both Mainnet and Testnet environments, but only the mainnet URL is configured.
Add testnet URL:
const RPC_URLS = {
mainnet: "https://firo-rpc.publicnode.com/",
+ testnet: "https://testnet-firo-rpc.publicnode.com/", // Replace with actual testnet URL
};

Likely invalid or redundant comment.
21-34: 🛠️ Refactor suggestion
Update JSON-RPC implementation and remove duplicate headers
Several improvements needed in the RPC request implementation:
- Using outdated JSON-RPC 1.0
- Static request ID could cause issues with concurrent requests
- Headers are defined both in axios instance and request config
Apply these changes:
const response = await axiosInstance.post(
RPC_URLS['mainnet'],
{
- jsonrpc: "1.0",
+ jsonrpc: "2.0",
- id: "js-client",
+ id: crypto.randomUUID(), // Generate unique ID for each request
method: method,
params: params ?? [],
- },
- {
- headers: {
- "Content-Type": "application/json",
- },
}
);

Likely invalid or redundant comment.
packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue (1)
43-53: LGTM: Modern Clipboard API implementation
The implementation correctly uses the modern Clipboard API with proper error handling.
packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue (1)
40-45: Verify sparkAccount null check
The component renders the Spark tab conditionally but still passes potentially null sparkAccount.
✅ Verification successful
The null check in the parent component is sufficient
The current implementation is correct for the following reasons:
- The parent component (index.vue) has a proper guard condition `v-if="selectedSendTab === 'spark' && !!accountInfo.sparkAccount"` that ensures the component is only rendered when `sparkAccount` exists
- The `??` operator in `:spark-account="accountInfo?.sparkAccount ?? null"` is redundant but harmless, since the `v-if` condition already guarantees `sparkAccount` exists
- The SparkSendTab component actively uses the `sparkAccount` prop throughout its implementation (e.g., `sparkAccount.sparkBalance.availableBalance`, `sparkAccount.allAddresses`) without any null checks because it relies on the parent's guard; a reduced illustration follows this list
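A reduced illustration of that guard (template shape per the review, not the full component):

<spark-send-tab
  v-if="selectedSendTab === 'spark' && !!accountInfo.sparkAccount"
  :network="network"
  :spark-account="accountInfo.sparkAccount"
/>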
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check if sparkAccount is properly handled in SparkSendTab
ast-grep --pattern 'sparkAccount: {
type: $_,
required: true
}'
Length of output: 70
Script:
#!/bin/bash
# Let's try to find the SparkSendTab component definition and check its props
fd "SparkSendTab.vue" --exec cat {}
Length of output: 37
Script:
#!/bin/bash
# Let's try a broader search for the component file
fd -e vue --exec grep -l "SparkSendTab" {}
Length of output: 117
Script:
#!/bin/bash
# Let's check the SparkSendTab component import and its usage
rg -A 5 "SparkSendTab" ./packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue
Length of output: 287
Script:
#!/bin/bash
# Let's examine the spark-send-tab.vue component
cat ./packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue
Length of output: 8340
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (3)
1-62: LGTM! Well-structured template with proper user feedback.
The template provides a clear user interface with:
- Proper input validation through disabled states
- Clear feedback through alerts
- Good separation of concerns between components
189-196: Improve precision handling in amount input.
The amount input handling could lead to precision loss when converting between string and number.
Apply this diff to use BigNumber consistently:
const inputAmount = (inputAmount: string) => {
if (inputAmount === "") {
inputAmount = "0";
}
const inputAmountBn = new BigNumber(inputAmount);
isMaxSelected.value = false;
- amount.value = inputAmountBn.lt(0) ? "0" : inputAmount;
+ amount.value = inputAmountBn.lt(0) ? "0" : inputAmountBn.toString();
};

268-302: LGTM! Clean and well-organized styles.
The styles follow good practices with:
- Proper use of LESS
- Clear class naming
- Good organization of layout styles
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (4)
189-202: 🛠️ Refactor suggestion
Strengthen props validation to prevent runtime errors.
The props lack proper validation which could lead to runtime errors. Consider adding required flags and type validation.
const props = defineProps({
network: {
type: Object as PropType<BitcoinNetwork>,
- default: () => ({}),
+ required: true,
+ validator: (value: BitcoinNetwork) => {
+ return Boolean(value.name && value.decimals && value.networkInfo);
+ }
},
accountInfo: {
type: Object as PropType<AccountsHeaderData>,
- default: () => ({}),
+ required: true,
+ validator: (value: AccountsHeaderData) => {
+ return Boolean(value.selectedAccount);
+ }
},
isSendToken: {
type: Boolean,
default: () => false,
},
});

Likely invalid or redundant comment.
311-334: 🛠️ Refactor suggestion
Enhance input validation for better security.
The current validation logic could be improved to prevent transactions with unreasonable fees and ensure proper amount validation.
const isInputsValid = computed<boolean>(() => {
+ // Check for minimum transaction amount
+ const minAmount = new BigNumber(props.network.dust);
+ if (new BigNumber(sendAmount.value).lt(minAmount)) {
+ return false;
+ }
+
+ // Check for maximum gas fee (e.g., 10% of balance)
+ const maxGasFee = new BigNumber(UTXOBalance.value).multipliedBy(0.1);
+ const currentGasFee = new BigNumber(gasCostValues.value[selectedFee.value].nativeValue);
+ if (currentGasFee.gt(maxGasFee)) {
+ return false;
+ }
+
if (
!isSparkAddress(addressTo.value) &&
!isAddress(addressTo.value, (props.network as BitcoinNetwork).networkInfo)
) {
return false;
}
// ... rest of the validation logic
});

Likely invalid or redundant comment.
261-289: 🛠️ Refactor suggestion
Add error handling for fee calculation and UTXO updates.
The fee calculation and UTXO update logic lack proper error handling which could lead to silent failures.
const setTransactionFees = (byteSize: number) => {
+ if (byteSize <= 0) {
+ console.error('Invalid byte size for fee calculation');
+ return;
+ }
const nativeVal = selectedAsset.value.price || "0";
- getGasCostValues(
+ return getGasCostValues(
props.network as BitcoinNetwork,
byteSize,
nativeVal,
props.network.decimals,
props.network.currencyName
- ).then((val) => (gasCostValues.value = val));
+ ).then((val) => (gasCostValues.value = val))
+ .catch((error) => {
+ console.error('Failed to calculate gas costs:', error);
+ gasCostValues.value = defaultGasCostVals;
+ });
};
const updateUTXOs = async () => {
const api = (await props.network.api()) as BitcoinAPI;
- return api.getUTXOs(addressFrom.value).then((utxos) => {
+ try {
+ const utxos = await api.getUTXOs(addressFrom.value);
accountUTXOs.value = utxos;
const txSize = calculateSizeBasedOnType(
accountUTXOs.value.length + (props.isSendToken ? 0 : 1),
2,
(props.network as BitcoinNetwork).networkInfo.paymentType
);
setTransactionFees(Math.ceil(txSize));
- });
+ } catch (error) {
+ console.error('Failed to update UTXOs:', error);
+ accountUTXOs.value = [];
+ }
};

Likely invalid or redundant comment.
478-524: 🛠️ Refactor suggestion
Enhance transaction security measures.
The transaction building process needs additional security checks to prevent potential issues with output values and change addresses.
const sendAction = async () => {
+ // Validate total output value doesn't exceed input value
+ const totalOutput = toAmount.add(currentFee);
+ if (totalOutput.gt(UTXOBalance.value)) {
+ throw new Error('Insufficient funds including fees');
+ }
+
// ... existing code ...
if (remainder.gtn(0)) {
+ // Check if remainder is above dust limit
+ if (remainder.lt(props.network.dust)) {
+ throw new Error('Change amount below dust limit');
+ }
+ // Validate change address
+ if (!isAddress(addressFrom.value, props.network.networkInfo)) {
+ throw new Error('Invalid change address');
+ }
txInfo.outputs.push({
address: props.network.displayAddress(addressFrom.value),
value: remainder.toNumber(),
});
}
};

Likely invalid or redundant comment.
packages/extension/src/providers/bitcoin/libs/api-firo.ts (1)
74-87: LGTM! Good error handling in getBalance method.
The method properly handles errors and edge cases, including the conversion of balance to string.
packages/extension/src/ui/action/App.vue (2)
323-329: LGTM! Proper initialization of Spark account for Firo network.
The implementation correctly initializes the Spark account state when the Firo network is selected.
287-296: 🛠️ Refactor suggestion
Add concurrency protection to generateNewSparkAddress.
The method lacks protection against concurrent executions, which could lead to race conditions if triggered rapidly.
+let isGeneratingSparkAddress = false;
const generateNewSparkAddress = async () => {
+ if (isGeneratingSparkAddress) return;
+ try {
+ isGeneratingSparkAddress = true;
const newSparkAddressResponse = await generateSparkAddress();
if (accountHeaderData.value.sparkAccount) {
accountHeaderData.value.sparkAccount.defaultAddress =
newSparkAddressResponse;
accountHeaderData.value.sparkAccount.allAddresses.push(
newSparkAddressResponse
);
}
+ } finally {
+ isGeneratingSparkAddress = false;
+ }
};

Likely invalid or redundant comment.
Actionable comments posted: 7
Note
Due to the large number of review comments, Critical severity comments were prioritized as inline comments.
♻️ Duplicate comments (8)
packages/extension/src/providers/bitcoin/types/bitcoin-network.ts (1)
56-59: Replace magic numbers with a named constant or utility function.

The length checks >= 144 and < 64 are magic numbers that implicitly handle Spark addresses. As noted in a previous review, consider using a dedicated isSparkAddress check or at minimum document these thresholds with constants.

Suggested improvement

+const MIN_PUBKEY_LENGTH = 64;
+const SPARK_ADDRESS_MIN_LENGTH = 144;
+
 export const getAddress = (pubkey: string, network: BitcoinNetworkInfo) => {
-  if (pubkey.length >= 144 || pubkey.length < 64) {
+  // Return as-is if already an address (short) or a Spark address (long)
+  if (pubkey.length >= SPARK_ADDRESS_MIN_LENGTH || pubkey.length < MIN_PUBKEY_LENGTH) {
     return pubkey;
   }

packages/extension/src/providers/bitcoin/types/index.ts (1)
114-126: Naming inconsistency: blockheight vs blockHeight.

FiroTxType uses blockheight (lowercase 'h') while SSTxType uses blockHeight (camelCase). This inconsistency could lead to bugs when developers expect consistent casing. Consider aligning with the API response format or documenting this difference clearly.

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue (1)
187-217: Consider adding rate limiting to prevent duplicate transactions.

The send action lacks rate limiting, which could lead to duplicate transactions if the user clicks multiple times rapidly. Consider adding a loading state or debouncing.
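A minimal sketch of such a guard, assuming a Vue ref and the component's existing sendAction handler (both names are illustrative, not the component's actual identifiers):

import { ref } from 'vue';

const isSending = ref(false);

// Hypothetical wrapper; drop-in for the button's click handler
const guardedSendAction = async () => {
  if (isSending.value) return; // ignore re-entrant clicks
  isSending.value = true;
  try {
    await sendAction(); // the component's existing send logic
  } finally {
    isSending.value = false; // re-enable the button in all cases
  }
};

Binding the same flag to the button's disabled state gives the user immediate feedback while the transaction is in flight.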
packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue (1)
12-29: Accessibility: Missing keyboard navigation for tabs.

The tab implementation from the previous review still lacks proper ARIA attributes and keyboard navigation support as noted in past review comments. If you plan to enable these tabs, ensure accessibility compliance.
Refer to the past review comment for the specific ARIA attributes and keyboard handlers needed.
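For illustration only, a sketch of a keyboard handler that would satisfy the ARIA tabs pattern for the two-tab case (it assumes the selectedSendTab ref this PR introduces; the handler name is hypothetical):

// Attach via @keydown on the tablist element
const onTabKeydown = (event: KeyboardEvent) => {
  if (event.key === 'ArrowRight' || event.key === 'ArrowLeft') {
    // With exactly two tabs, either arrow toggles to the other tab
    selectedSendTab.value =
      selectedSendTab.value === 'transparent' ? 'spark' : 'transparent';
    event.preventDefault();
  }
};

The buttons themselves would additionally need role="tab", aria-selected, and a roving tabindex as described in the earlier review comment.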
packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue (3)
259-268: Consider adding error handling for fee calculation.

The promise chain lacks error handling. If getGasCostValues fails, gasCostValues.value won't be updated and will retain stale data, potentially causing incorrect fee display.
183-196: Props should be marked as required.

The network and accountInfo props use empty object defaults, which could lead to runtime errors if the parent component doesn't provide them. These appear to be essential for the component to function.
524-530: Missing dust threshold check for change output.

When the remainder is greater than 0, it's added as a change output without checking whether it exceeds the dust threshold. A remainder below the dust limit would create an unspendable output and waste fees.
🔎 Proposed fix
 const remainder = UTXOBalance.value.sub(toAmount).sub(currentFee);
-if (remainder.gtn(0)) {
+const dustThreshold = (props.network as BitcoinNetwork).dust;
+if (remainder.gtn(dustThreshold)) {
   txInfo.outputs.push({
     address: props.network.displayAddress(addressFrom.value),
     value: remainder.toNumber(),
   });
+} else if (remainder.gtn(0)) {
+  // Remainder below dust - add to fee instead of creating unspendable output
+  // Fee is already deducted, no action needed
 }

packages/extension/src/providers/bitcoin/libs/api-firo.ts (1)
29-37: Add error handling for network failures.

The fetch chain doesn't handle network errors. If the request fails, the promise will reject without graceful handling.
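A sketch of the kind of guard meant here; balanceUrl is a placeholder for whatever endpoint api-firo.ts actually queries, not the real code:

const getBalanceSafe = async (address: string): Promise<string> => {
  try {
    const res = await fetch(balanceUrl(address)); // placeholder endpoint builder
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return String(await res.json());
  } catch (error) {
    console.error('Failed to fetch balance:', error);
    return '0'; // graceful fallback instead of an unhandled rejection
  }
};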
🟠 Major comments (25)
packages/extension/src/libs/utils/number-formatter.ts-316-317 (1)

316-317: Replace @ts-ignore with a compatible string method.

Using @ts-ignore suppresses all TypeScript errors on the following line, which can hide legitimate type issues. The replaceAll method may not be recognized depending on your TypeScript target configuration.

🔎 Recommended fix using .replace() with global regex

-  // @ts-ignore
-  const value = new BigNumber(_value.toString().replaceAll('%', ''));
+  const value = new BigNumber(_value.toString().replace(/%/g, ''));

This approach is more widely compatible and eliminates the need for type suppression.

packages/extension/src/providers/bitcoin/libs/firo-wallet/utils.ts-1-6 (1)

1-6: Use generics for type safety and align callback signature with implementation.

The function signature has two issues:
- Using any types reduces type safety. Consider using a generic type parameter.
- The callback signature declares an optional index parameter, but Line 11 never passes it. Either pass the index or remove it from the signature.

🔎 Proposed refactor with generics and consistent callback signature

-export const chunkedEvery = (
-  data: any[],
+export const chunkedEvery = <T>(
+  data: T[],
   chunksize: number,
-  callback: (data: any, index?: number) => void,
+  callback: (data: T, index: number) => void,
   finished: () => void,
 ) => {

Then update Line 11 to pass the index:

-  callback(data[i]);
+  callback(data[i], i);

Committable suggestion skipped: line range outside the PR's diff.

packages/extension/src/providers/bitcoin/libs/firo-wallet/utils.ts-7-18 (1)

7-18: Add error handling and input validation.

The implementation has two major concerns:
- No error handling: If the callback throws an error, processing halts and the finished callback is never invoked, leaving operations incomplete.
- No input validation: Invalid inputs (e.g., chunksize <= 0, null data) can cause runtime errors or infinite loops.

🔎 Proposed fix with error handling and validation

 export const chunkedEvery = <T>(
   data: T[],
   chunksize: number,
   callback: (data: T, index: number) => void,
   finished: () => void,
 ) => {
+  if (!data || !Array.isArray(data)) {
+    throw new Error('data must be a non-null array');
+  }
+  if (chunksize <= 0) {
+    throw new Error('chunksize must be positive');
+  }
+  if (typeof callback !== 'function' || typeof finished !== 'function') {
+    throw new Error('callback and finished must be functions');
+  }
+
   let i = 0;
   (function chunk() {
     const end = Math.min(i + chunksize, data.length);
     for (; i < end; ++i) {
-      callback(data[i], i);
+      try {
+        callback(data[i], i);
+      } catch (error) {
+        console.error('Error in chunkedEvery callback:', error);
+        // Continue processing remaining items or rethrow based on requirements
+      }
     }
     if (i < data.length) {
       setTimeout(chunk, 0);
     } else {
-      finished.call(null);
+      finished();
     }
   })();
 };

Note: The error handling strategy (continue vs. abort) should align with your use case requirements.

Committable suggestion skipped: line range outside the PR's diff.

packages/extension/src/libs/utils/wasm-loader.ts-46-68 (1)

46-68: Race condition: concurrent getInstance() calls may load WASM multiple times.

If getInstance() is called multiple times before the first load completes (and before this.instance is set), each call will trigger a separate loadWasm(). Consider using a pending promise to deduplicate concurrent requests.

🔎 Proposed fix

 class WasmInstance {
   instance: WasmModule | null = null;
+  private loadingPromise: Promise<WasmModule> | null = null;

   constructor() {
-    loadWasm()
-      .then(module => {
-        this.instance = module;
-      })
-      .catch(error => {
-        console.error('Error loading WASM module:', error);
-      });
+    // Eagerly start loading
+    this.loadingPromise = loadWasm()
+      .then(module => {
+        this.instance = module;
+        return module;
+      })
+      .catch(error => {
+        console.error('Error loading WASM module:', error);
+        this.loadingPromise = null;
+        throw error;
+      });
   }

   public async getInstance(): Promise<WasmModule> {
     if (this.instance) {
-      return Promise.resolve(this.instance);
-    } else {
-      const wasm = await loadWasm();
-      this.instance = wasm;
-      return Promise.resolve(wasm);
+      return this.instance;
     }
+    if (this.loadingPromise) {
+      return this.loadingPromise;
+    }
+    this.loadingPromise = loadWasm().then(module => {
+      this.instance = module;
+      return module;
+    });
+    return this.loadingPromise;
   }
 }

packages/extension/src/libs/background/index.ts-177-181 (1)

177-181: Wrap switch case declarations in a block and handle unlock failure gracefully.

Static analysis correctly flags that const declarations in switch cases can leak to other cases. Additionally, if unlock() fails after setSecret() is called, the wallet will have the mnemonic set but the keyring will remain locked, causing an inconsistent state.

🔎 Proposed fix

 case InternalMethods.unlock:
-  const password = message?.params?.[0] as string;
-  const mnemonic = await this.#keyring.getSavedMnemonic(password);
-  this.#wallet.setSecret(mnemonic);
-  return unlock(this.#keyring, message);
+  {
+    const password = message?.params?.[0] as string;
+    const result = await unlock(this.#keyring, message);
+    if (!result.error) {
+      const mnemonic = await this.#keyring.getSavedMnemonic(password);
+      this.#wallet.setSecret(mnemonic);
+    }
+    return result;
+  }

Committable suggestion skipped: line range outside the PR's diff.

packages/extension/src/libs/spark-handler/getFee.ts-6-11 (1)

6-11: Missing error handling for fee estimation failure.

If firoElectrum.estimateFee(2) fails or returns an unexpected value (e.g., -1 which some Electrum servers return when fee estimation is unavailable), the calculation will produce invalid results.

🔎 Proposed fix with fallback

 export const getFee = async (tempTx: bitcoin.Transaction) => {
   const feeRate = (await firoElectrum.estimateFee(2)) as number;
+
+  if (feeRate <= 0) {
+    throw new Error('Unable to estimate fee rate');
+  }

   return new BigNumber(feeRate)
     .times(Math.ceil(tempTx.virtualSize() / 1000))
     .times(SATOSHI);
 };

packages/extension/src/providers/bitcoin/libs/firo-wallet/firo-wallet.ts-99-146 (1)

99-146: Null safety check required before destructuring storage result.

Line 109 has a potential null reference error. The storage.get() method returns null if the key doesn't exist, and destructuring const { secret } from null will throw a TypeError. Add an explicit null check before destructuring:

 const { secret } = await this.#storage.get(
   configs.STORAGE_KEYS.FIRO_WALLET_SECRET,
 );
+if (!secret) {
+  throw new Error('FiroWallet not initialized');
+}
-const seed = bip39.mnemonicToSeedSync(secret!);
+const seed = bip39.mnemonicToSeedSync(secret);

The derivation logic assumes a strict input order that is not validated. The algorithm correctly derives keys when getTransactionsAddresses() returns addresses split exactly as [external_0, external_1, ..., internal_0, internal_1, ...]. However, this assumption is fragile: if the input order differs or if this function is called with addresses in another sequence, the key mappings will be incorrect. Consider adding a validation or storing derivation metadata (address index and change flag) with each address instead of relying on position-based assumptions.

packages/extension/src/libs/spark-handler/createTempTx.ts-62-74 (1)

62-74: Validate that keyPair exists for each UTXO address before signing.

If addressKeyPairs[utxo.address] is undefined, the subsequent keyPair.sign() call will throw. Add a guard to provide a clearer error message.

Suggested fix

 for (let index = 0; index < spendableUtxos.length; index++) {
   const utxo = spendableUtxos[index];
   const keyPair = addressKeyPairs[utxo.address];
+  if (!keyPair) {
+    throw new Error(`Missing key pair for address: ${utxo.address}`);
+  }
+
   const Signer = {
     sign: (hash: Uint8Array) => {
       return Buffer.from(keyPair.sign(hash));
     },
     publicKey: Buffer.from(keyPair.publicKey),
   } as unknown as bitcoin.Signer;
   tx.signInput(index, Signer);
 }

packages/extension/src/libs/spark-handler/createTempTx.ts-53-60 (1)

53-60: Guard against empty spendableUtxos array before accessing first element.

If changeAmount > 0 but spendableUtxos is empty, accessing spendableUtxos[0].address will throw a runtime error. Add a defensive check.

Suggested fix

 if (changeAmount.gt(0)) {
+  if (spendableUtxos.length === 0) {
+    throw new Error('Cannot add change output: no spendable UTXOs available');
+  }
   const firstUtxoAddress = spendableUtxos[0].address;
   tx.addOutput({
-    address: firstUtxoAddress!,
+    address: firstUtxoAddress,
     value: changeAmount.toNumber(),
   });
 }

packages/extension/src/libs/spark-handler/createTempFromSparkTx.ts-44 (1)

44: Avoid parseFloat for cryptocurrency amounts due to floating-point precision loss.

Using parseFloat(amount) can introduce precision errors with decimal values. Consider using a library like BigNumber or converting to satoshis as an integer.

Suggested fix using BigNumber

+import BigNumber from 'bignumber.js';
+
+// Convert string amount to satoshis (integer)
+const amountSatoshis = new BigNumber(amount).times(1e8).integerValue().toNumber();
+
 tempTx.addOutput({
   address: to,
-  value: parseFloat(amount),
+  value: amountSatoshis,
 });

Committable suggestion skipped: line range outside the PR's diff.

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue-130-155 (1)

130-155: Validation logic may allow invalid transactions.

With the balance and dust checks commented out (lines 143-153), the validation only checks address format and decimal validity. This could allow users to submit transactions with amounts exceeding their balance or below the dust threshold, which would fail at the network level.

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue-143-154 (1)

143-154: Commented-out validation code should be resolved.

There's a significant block of commented-out validation logic for dust amount and balance checks. Either uncomment and enable this validation, or remove the dead code if it's no longer needed. Leaving it commented creates ambiguity about intended behavior.

packages/extension/src/providers/bitcoin/ui/libs/signer.ts-145-155 (1)

145-155: Missing error handling for unknown addresses in keypair mapping.

If input.address is not found in addressKeyPairMapping, accessing keyPair.sign or keyPair.publicKey will throw a runtime error. Add validation to provide a clearer error message.

🔎 Proposed fix

 allInputs.forEach((input, i) => {
   const keyPair = addressKeyPairMapping[input.address];
+  if (!keyPair) {
+    throw new Error(`No keypair found for address: ${input.address}`);
+  }
   const Signer = {
     sign: (hash: Uint8Array) => {
       return Buffer.from(keyPair.sign(hash));
     },
     publicKey: Buffer.from(keyPair.publicKey),
   } as unknown as bitcoin.Signer;
   tx.signInput(i, Signer);
 });

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/spark-send-tab.vue-228-231 (1)

228-231: Remove debug code and hardcoded fallback value before release.

The console.log statement and the hardcoded fallback value 400000000 with the TODO comment are development artifacts that should be removed before merging to production.

🔎 Proposed fix

 if (availableAsset.value.coingeckoID) {
   const market = new MarketData();
   const marketData = await market.getMarketData([
     availableAsset.value.coingeckoID,
   ]);
   availableAsset.value.price = (marketData[0]?.current_price ?? 0).toString();
 }
-console.log(props.sparkAccount.sparkBalance.availableBalance);
-availableAsset.value.balance = String(
-  props.sparkAccount.sparkBalance.availableBalance || 400000000, // TODO : remove this fallback value
-);
+availableAsset.value.balance = String(
+  props.sparkAccount.sparkBalance.availableBalance || 0,
+);
 availableAsset.value.name = 'Available Balance';

packages/extension/src/libs/spark-handler/index.ts-460-486 (1)

460-486: Remove extensive commented-out debugging code.

Lines 460-486 contain a large block of commented-out code that clutters the implementation. This should be removed to improve readability.

packages/extension/src/libs/spark-handler/index.ts-315-328 (1)

315-328: Remove large block of commented-out code.

Lines 323-328 contain commented-out code and TODO notes. Remove dead code or convert to proper documentation if the information is valuable.

packages/extension/src/providers/bitcoin/ui/send-transaction/index.vue-11-29 (1)

11-29: Dead code: Tabs are disabled with v-if="false".

Lines 12-29 define a tab UI that is permanently disabled with v-if="false". If this feature is not ready, consider removing this code or moving it to a feature branch. The commented condition on line 11 suggests this should be enabled for Firo networks with Spark accounts, but it's currently hardcoded to false.

Should this be enabled, or should the tab code be removed until it's ready for production?

packages/extension/src/libs/spark-handler/getSparkCoinInfo.ts-118-242 (1)

118-242: Remove extensive commented-out example code.

Lines 118-242 contain large blocks of commented-out example usage code. This significantly bloats the file and reduces readability. Either remove this code or move it to separate documentation/examples.

packages/extension/src/libs/spark-handler/getMintTxData.ts-45-47 (1)

45-47: Memory leak on error throw.

If mintedCoinData creation fails and an error is thrown (line 46), the previously allocated decodedAddressPtr (line 29) is never freed, causing a memory leak.

🔎 Recommended fix

Wrap the entire function body in a try/finally block to ensure cleanup:

+let decodedAddressPtr;
+let outputsPointerArray;
+let serialContextPointer;
+
+try {
-  const decodedAddressPtr = wasmModule.ccall(
+  decodedAddressPtr = wasmModule.ccall(
     'js_decodeAddress',
     'number',
     ['string', 'number'],
     [address, Number(isTestNetwork)],
   );
   // ... rest of the function
+} finally {
+  if (decodedAddressPtr) wasmModule._free(decodedAddressPtr);
+  if (outputsPointerArray) wasmModule._free(outputsPointerArray);
+  if (serialContextPointer) wasmModule._free(serialContextPointer);
+}

Committable suggestion skipped: line range outside the PR's diff.

packages/extension/src/libs/spark-handler/getMintTxData.ts-70-72 (1)

70-72: Inconsistent error handling approach.

Line 46 throws an Error, but line 71 also throws an Error. However, the cleanup code (lines 104-105) will not execute if an error is thrown, leading to memory leaks. Use a try/finally pattern to ensure cleanup.

packages/extension/src/libs/spark-handler/generateSparkWallet.ts-45-50 (1)

45-50: Memory leak in error handling path.

The error handler frees spendKeyDataObj and keyDataPtr, but if spendKeyDataObj creation fails (line 29), it will be 0 and the subsequent free calls may reference uninitialized variables. Additionally, if spendKeyObj creation fails, spendKeyObj is not freed before returning.

🔎 Recommended fix

 } catch (e) {
   console.log(e);
-  wasm.ccall('js_freeSpendKeyData', null, ['number'], [spendKeyDataObj]);
-  wasm._free(keyDataPtr);
+  if (spendKeyDataObj && spendKeyDataObj !== 0) {
+    wasm.ccall('js_freeSpendKeyData', null, ['number'], [spendKeyDataObj]);
+  }
+  if (keyDataPtr) {
+    wasm._free(keyDataPtr);
+  }
   return 0;
 }

packages/extension/src/libs/utils/updateAndSync/updateCoinSet.ts-155-163 (1)

155-163: Set operations on arrays won't deduplicate correctly.

JavaScript Set uses reference equality for arrays, so new Set(coins) where coins is string[][] won't deduplicate coins with identical content. Two separate ['a', 'b'] arrays will be treated as different elements.

Additionally, differenceSets returns elements in the first set not in the second, but the intent here appears to be merging/union. Consider:

-const updatedCoinsSet = differenceSets(
-  new Set(localSets?.[index]?.coins ?? []),
-  new Set(newCoins),
-);
-localSets[index] = {
-  blockHash: remoteMeta.blockHash,
-  setHash: remoteMeta.setHash,
-  coins: Array.from(updatedCoinsSet),
-};
+// For full replacement, use new coins; for delta, append new coins to existing
+const existingCoins = isFullReplacement ? [] : (localSets?.[index]?.coins ?? []);
+const mergedCoins = [...existingCoins, ...newCoins];
+localSets[index] = {
+  blockHash: remoteMeta.blockHash,
+  setHash: remoteMeta.setHash,
+  coins: mergedCoins,
+};

Committable suggestion skipped: line range outside the PR's diff.

packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts-73-91 (1)

73-91: Retry condition is always true; risk of infinite retry loop.

In the catch block, this.mainClient is set to undefined on line 82, so the check at line 84 (!this.mainClient) is always true. This creates an infinite retry loop with no backoff or max retry limit.

Consider adding retry limits or exponential backoff:

+private retryCount = 0;
+private readonly MAX_RETRIES = 5;

 async connectMain() {
   try {
     const peer = getRandomHardcodedPeer();
     this.mainClient = new ElectrumClient(peer.host, 50004, 'wss');
     await this.mainClient.connect('electrum-client-browser', '1.4');
     this.wasConnectedAtLeastOnce = true;
+    this.retryCount = 0;
   } catch (e) {
     this.mainClient = undefined;
     console.error('electrum_wallet:connectMain', e);
-    if (!this.mainClient) {
+    if (this.retryCount < this.MAX_RETRIES) {
+      this.retryCount++;
       console.warn('electrum_wallet:connectMain', 'retry');
-      setTimeout(() => {
+      const delay = Math.min(5000 * Math.pow(2, this.retryCount), 60000);
+      setTimeout(() => {
         this.connectMain();
-      }, 5000);
+      }, delay);
+    } else {
+      console.error('electrum_wallet:connectMain', 'Max retries exceeded');
     }
   }
 }

Committable suggestion skipped: line range outside the PR's diff.

packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts-103-111 (1)

103-111: Methods assume mainClient is connected.

Multiple methods use this.mainClient! without checking if the client is connected. If called before connection or after disconnect, this will throw a runtime error.

Consider adding a connection check helper:

+private ensureConnected(): ElectrumClient {
+  if (!this.mainClient) {
+    throw new Error('Electrum client not connected');
+  }
+  return this.mainClient;
+}

 async getBalanceByAddress(address: string): Promise<any> {
   const script = bitcoin.address.toOutputScript(address, networkInfo);
   const hash = bitcoin.crypto.sha256(script);
   const reversedHash = Buffer.from(hash.reverse());
-  return await this.mainClient!.blockchain_scripthash_getBalance(
+  return await this.ensureConnected().blockchain_scripthash_getBalance(
     reversedHash.toString('hex'),
   );
 }

packages/extension/src/providers/bitcoin/libs/firo-wallet/base-firo-wallet.ts-32-34 (1)

32-34: setSecret should await the async underlying method.

According to the FiroWallet implementation, setSecret(secret: string) returns Promise<void> as it persists data to storage. The current implementation doesn't await this, which could cause race conditions if callers assume the secret is persisted after the call returns.

-setSecret(mnemonic: string) {
-  this.#wallet.setSecret(mnemonic);
+async setSecret(mnemonic: string): Promise<void> {
+  await this.#wallet.setSecret(mnemonic);
 }

🟡 Minor comments (20)

packages/extension/package.json-69-70 (1)

69-70: Align ecpair versions between root and extension package.json files.

The root package.json pins ecpair to the RC version 3.0.0-rc.0, while the extension uses ^3.0.0 (which resolves to the stable release). This mismatch can cause version resolution conflicts. Update the root package.json to use ^3.0.0 for consistency, or pin the extension to the same RC version if needed for compatibility.

Additionally, [email protected] uses exact pinning while other dependencies in the extension package use caret ranges (^). For consistency, either adopt a uniform pinning strategy or document why electrum-client-browser requires an exact version.

package.json-52-54 (1)

52-54: Upgrade ecpair from release candidate to stable 3.0.0.

The npm registry has ecpair 3.0.0 available as a stable release, but the root package.json specifies 3.0.0-rc.0. The extension package imports ecpair in multiple places (firo-wallet.ts, createTempTx.ts, getTotalMintedAmount.ts) but relies on the root-level dependency rather than declaring its own.

Update the root to use the stable 3.0.0 release instead of the release candidate to ensure better stability and long-term support.

packages/extension/src/libs/spark-handler/utils.ts-1-12 (1)

1-12: Handle potential undefined return value.

Lines 8-11: If hex.match(/.{2}/g) returns null (which could happen for edge cases), the function returns undefined instead of a string. This violates the implicit return type.

Add explicit error handling or update the return type to string | undefined.

🔎 Proposed fix

-export const numberToReversedHex = (num: number) => {
+export const numberToReversedHex = (num: number): string => {
   let hex = num.toString(16);
   if (hex.length % 2 !== 0) {
     hex = '0' + hex;
   }
   const bytes = hex.match(/.{2}/g);
+  if (!bytes) {
+    return '00'; // or throw new Error('Invalid hex conversion')
+  }
   const reversed = bytes?.reverse();
-  return reversed?.join('');
+  return reversed.join('');
 };

packages/extension/src/libs/keyring/keyring.ts-76-78 (1)

76-78: Password validation is properly implemented via cryptographic decryption; however, verify secure memory handling in callers.

The underlying getSavedMnemonic() implementation in packages/keyring/src/index.ts (lines 213-224) validates the password implicitly through the decrypt() function - an incorrect password will fail decryption. However, the single caller in packages/extension/src/libs/background/index.ts (line 179) retrieves the mnemonic and passes it to wallet.setSecret(), where it is stored in memory as this.secret without visible clearing after use. Ensure the mnemonic is wiped from memory after use to prevent sensitive data leakage.

packages/extension/src/libs/keyring/keyring.ts-73-75 (1)

73-75: Ensure private key exposure is justified and future callers follow secure patterns.

This method exposes raw private key data. While the current usage in Spark wallet generation handles it securely (immediate WASM conversion and memory cleanup), the exposed API creates risk for misuse. If additional callers are added, enforce:
- Immediate use without intermediate storage or logging
- Explicit memory management (especially for the intermediate hex string)
- Consistent cleanup on success and error paths

packages/extension/src/providers/bitcoin/ui/send-transaction/components/send-spark-address-input.vue-86-88 (1)

86-88: Initial value not validated - add immediate: true to the watcher.

If the component mounts with a pre-filled Spark address, isValidSparkAddress remains false until the address changes. This causes the input to briefly display red text and hide the identicon. Adding immediate: true ensures the initial value is validated on mount.

🔎 Proposed fix

-watch(btcAddress, async () => {
+watch(btcAddress, async () => {
   isValidSparkAddress.value = await isSparkAddress(btcAddress.value);
-});
+}, { immediate: true });

packages/extension/src/libs/utils/wasm-loader.ts-34-44 (1)

34-44: Avoid async Promise executor - simplify the function.

The async Promise executor is an anti-pattern flagged by static analysis. If initSpark() throws synchronously before the first await, the error won't be caught and the Promise won't reject properly. Since initSpark() already returns a Promise, the wrapper is unnecessary.

🔎 Proposed fix

-async function loadWasm(): Promise<WasmModule> {
-  return new Promise(async (resolve, reject) => {
-    const wasmModule = await initSpark();
-
-    if (typeof wasmModule !== 'undefined') {
-      resolve(wasmModule);
-    } else {
-      reject(new Error('Failed to load WASM module.'));
-    }
-  });
+async function loadWasm(): Promise<WasmModule> {
+  const wasmModule = await initSpark();
+  if (typeof wasmModule === 'undefined') {
+    throw new Error('Failed to load WASM module.');
+  }
+  return wasmModule;
 }

packages/extension/src/providers/bitcoin/libs/firo-wallet/firo-wallet.ts-228-241 (1)

228-241: Spark index initialization race condition.

If two calls to getSparkAddressAsync occur simultaneously before the index exists, both could read the non-existent state and both set it to 1, potentially causing address reuse issues.

Consider using atomic read-modify-write or initializing the spark index during wallet setup rather than lazily.

packages/extension/src/providers/bitcoin/libs/firo-wallet/firo-wallet.ts-166-171 (1)

166-171: Silent error swallowing in UTXO fetching.

Errors during UTXO fetching are logged but silently ignored, which could lead to incomplete UTXO sets being used for transaction construction without the caller being aware.

Consider throwing or returning an error indicator so callers can decide how to handle partial failures.

packages/extension/src/providers/bitcoin/libs/electrum-client/abstract-electrum.ts-40-58 (1)

40-58: Inconsistent type definition style and potential type error for blockHash.

FullTransactionModel uses a class with blockHash: number = 0, but block hashes are typically hexadecimal strings. This appears to be a type mismatch.

🔎 Proposed fix

 export class FullTransactionModel {
   address: string = '';
-  blockHash: number = 0;
+  blockHash: string = '';

packages/extension/src/providers/bitcoin/libs/firo-wallet/firo-wallet.ts-89-97 (1)

89-97: Unhandled undefined/null secret in getSecret.

If the storage get call returns an object without a secret property (e.g., { seed: '...' } but no secret), the error message "FiroWallet not initialize" is thrown, but the destructuring may also fail if the storage returns null or undefined entirely.

🔎 Proposed defensive handling

 async getSecret(): Promise<Buffer> {
-  const { secret } = await this.#storage.get(
+  const data = await this.#storage.get(
     configs.STORAGE_KEYS.FIRO_WALLET_SECRET,
   );
-  if (!secret) {
+  if (!data?.secret) {
     throw Error('FiroWallet not initialize');
   }
-  return bip39.mnemonicToSeedSync(secret);
+  return bip39.mnemonicToSeedSync(data.secret);
 }

packages/extension/src/libs/spark-handler/getFee.ts-4 (1)

4: Remove .ts extension from import path.

Including the .ts extension in the import is non-standard and may cause issues with certain bundlers or TypeScript configurations.

🔎 Proposed fix

-import { firoElectrum } from '@/providers/bitcoin/libs/electrum-client/electrum-client.ts';
+import { firoElectrum } from '@/providers/bitcoin/libs/electrum-client/electrum-client';

packages/extension/src/providers/bitcoin/libs/firo-wallet/firo-wallet.ts-265-266 (1)

265-266: Dubious integer cast using multiplication.

index = index * 1 is an unusual way to cast to integer. It doesn't actually truncate floats (e.g., 1.5 * 1 = 1.5). Use Math.floor(index) or index | 0 for proper integer conversion.

🔎 Proposed fix

 async _getNodeAddressByIndex(node: number, index: number) {
-  index = index * 1; // cast to int
+  index = Math.floor(index);

packages/extension/src/providers/bitcoin/libs/filter-ordinals.ts-49-50 (1)

49-50: Remove debug console.log before merging.

This debug statement will clutter production logs and potentially expose ordinal data unnecessarily.

🔎 Proposed fix

 return getAllOrdinals(address, networkName, []).then(ordinals => {
-  console.log({ ordinals });
-
   return utxos.filter(utxo => {

packages/extension/src/libs/spark-handler/serializeMintContext.ts-19-20 (1)

19-20: Potential issue with txHashLength exceeding buffer length.

If txHashLength is greater than txHash.length, the slice operation will silently return a shorter buffer than expected. Consider adding validation or using the actual buffer length as a safeguard.

🔎 Proposed fix

 for (let i = 0; i < inputs.length; i++) {
-  const txHashBuffer = inputs[i].txHash.slice(0, inputs[i].txHashLength);
+  const txHash = inputs[i].txHash;
+  const txHashLength = Math.min(inputs[i].txHashLength, txHash.length);
+  const txHashBuffer = txHash.slice(0, txHashLength);

packages/extension/src/libs/spark-handler/getTotalMintedAmount.ts-6-8 (1)

6-8: Remove unused keyPair field from SpendableUtxo type definition.

The keyPair: ECPairInterface field is included in the SpendableUtxo type but is never accessed within the getTotalMintedAmount function. The function only uses utxo.txid, utxo.vout, and utxo.satoshis. Remove the unnecessary field to clean up the type definition.

packages/extension/src/providers/bitcoin/ui/send-transaction/verify-transaction/index.vue-231-235 (1)

231-235: Avoid exposing raw error objects to UI.

Line 234 uses JSON.stringify(error) to display error messages. This can expose sensitive implementation details, stack traces, or internal error structures to users. Consider extracting just the error message.

🔎 Proposed fix

 } catch (error) {
   isProcessing.value = false;
   console.error('error', error);
-  errorMsg.value = JSON.stringify(error);
+  errorMsg.value = error instanceof Error ? error.message : 'An unexpected error occurred';
 }

packages/extension/src/libs/spark-handler/generateSparkWallet.ts-79-81 (1)

79-81: Swallows error details in catch block.

The catch block on line 79 returns zero values without logging or re-throwing the error. This makes debugging difficult and hides failures silently.

🔎 Proposed fix

 } catch (error) {
+  console.error('Failed to create incoming view key:', error);
   return { incomingViewKeyObj: 0, fullViewKeyObj: 0 };
 }

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue-298-307 (1)

298-307: Use parseFloat instead of parseInt for amount check.

Using parseInt truncates decimal values, so amounts like 0.5 would evaluate to 0 and the button would just show "Send" instead of "Send 0.5 FIRO".

 const sendButtonTitle = computed(() => {
   let title = 'Send';
-  if (parseInt(sendAmount.value) > 0)
+  if (parseFloat(sendAmount.value) > 0)
     title =
       'Send ' +
       formatFloatingPointValue(sendAmount.value).value +
       ' ' +
       selectedAsset.value?.symbol!.toUpperCase();
   return title;
 });

packages/extension/src/providers/bitcoin/ui/send-transaction/tabs/transparent-send-tab.vue-121-125 (1)

121-125: Fragile address type detection for send action selection.

The logic !isAddress(addressTo, network.networkInfo) ? sendSparkAction : sendAction assumes that if an address isn't a regular Bitcoin address, it must be a Spark address. However, if the user enters an invalid address (neither regular nor Spark), this would incorrectly route to sendSparkAction. Consider explicitly checking for Spark address:

 :click="
-  !isAddress(addressTo, network.networkInfo)
-    ? sendSparkAction
-    : sendAction
+  isSparkAddress(addressTo)
+    ? sendSparkAction
+    : sendAction
 "

Note: The isInputsValid computed property already validates the address, so this may only affect edge cases where validation is bypassed.

Committable suggestion skipped: line range outside the PR's diff.
export async function sendFromSparkAddress(
  network: bitcoin.Network,
  to: string,
  amount: string,
  subtractFee = false,
): Promise<string | undefined> {
  const diversifier = 1n;
  const db = new IndexedDBHelper();
  const Module = await wasmInstance.getInstance();
  const spendKeyObj = await getSpendKeyObj(Module);
  const isSparkTransaction = await isSparkAddress(to);

  const ownedCoins = ((await db.readData('myCoins')) || []) as OwnedCoinData[];

  const usedCoinsTags = await db.readData<{ tags: string[] }>(
    DB_DATA_KEYS.usedCoinsTags,
  );
  const coinsTagsSet = new Set(usedCoinsTags.tags);
  const myCoinsTagsSet = new Set((ownedCoins ?? []).map(coin => coin.tag));

  const usedMyCoinsTagsSet = intersectSets(coinsTagsSet, myCoinsTagsSet);
  const revalidatedCoins = ownedCoins.filter(ownedCoin => {
    return !usedMyCoinsTagsSet.has(ownedCoin.tag);
  });

  const spendCoinList: {
    coin: string[];
    setId: number;
    metadata: number;
    deserializedCoinObj: number;
  }[] = [];

  const { incomingViewKeyObj, fullViewKeyObj } = await getIncomingViewKey(
    Module,
    spendKeyObj,
  );

  const addressObj = Module.ccall(
    'js_getAddress',
    'number',
    ['number', 'number'],
    [incomingViewKeyObj, diversifier],
  );

  // Create recipients vector for spend transaction
  const recipientsVector = Module.ccall(
    'js_createRecipientsVectorForCreateSparkSpendTransaction',
    'number',
    ['number'],
    [1], // intended final size
  );

  if (!isSparkTransaction) {
    Module.ccall(
      'js_addRecipientForCreateSparkSpendTransaction',
      null,
      ['number', 'number', 'number'],
      [recipientsVector, BigInt(+amount * 10 ** 8), 0],
    );
  }

  const privateRecipientsVector = Module.ccall(
    'js_createPrivateRecipientsVectorForCreateSparkSpendTransaction',
    'number',
    ['number'],
    [1], // intended final size
  );

  Module.ccall(
    'js_addPrivateRecipientForCreateSparkSpendTransaction',
    null,
    ['number', 'number', 'number', 'string', 'number'],
    [
      privateRecipientsVector,
      addressObj,
      BigInt(+amount * 10 ** 8),
      'Private memo',
      1,
    ],
  );

  const coinsList = Module.ccall(
    'js_createCoinsListForCreateSparkSpendTransaction',
    'number',
    [],
    [],
  );

  const coinMetaPromiseList: Promise<void>[] = [];

  revalidatedCoins.forEach(ownedCoin => {
    const myCoinMetaData = getSparkCoinInfo({
      coin: ownedCoin.coin,
      fullViewKeyObj,
      incomingViewKeyObj,
      wasmModule: Module,
      keepMemory: true,
    })
      .then(data => {
        if (!data.isUsed) {
          spendCoinList.push({
            coin: ownedCoin.coin,
            setId: ownedCoin.setId,
            metadata: data.metaData,
            deserializedCoinObj: data.deserializedCoinObj,
          });
        } else {
        }
      })
      .catch(err => {
        console.error('Error getting spark coin info', err);
      });
    coinMetaPromiseList.push(myCoinMetaData);
  });

  await Promise.allSettled(coinMetaPromiseList);

  spendCoinList.forEach(spendCoin => {
    Module.ccall(
      'js_setCSparkMintMetaId', // C++ function name
      null, // Return type
      ['number', 'number'], // Argument types
      [spendCoin.metadata, spendCoin.setId],
    );
  });

  try {
    spendCoinList.forEach(spendCoin => {
      Module.ccall(
        'js_addCoinToListForCreateSparkSpendTransaction',
        null,
        ['number', 'number'],
        [coinsList, spendCoin.metadata],
      );
    });
  } catch (error) {
    console.error('Error adding coins to list:', error);
  }
  const coverSetDataMap = Module.ccall(
    'js_createCoverSetDataMapForCreateSparkSpendTransaction',
    'number',
    [],
    [],
  );

  const groupedBySet = spendCoinList.reduce(
    (acc, coin) => {
      if (!acc[coin.setId]) {
        acc[coin.setId] = [];
      }
      acc[coin.setId].push(coin);
      return acc;
    },
    {} as Record<number, typeof spendCoinList>,
  );

  console.debug('groupedBySet', groupedBySet);

  const deserializedCoinList: Record<string, number[]> = {};
  // TODO: move to separate function

  for (const set in groupedBySet) {
    const fullSet = await db.getSetById(Number(set));
    fullSet?.coins.forEach(coin => {
      const serializedCoin = getSerializedCoin(
        coin[0] as string,
      ) as unknown as ArrayLike<number>;

      const serializedCoinPointer = Module._malloc(serializedCoin.length);
      Module.HEAPU8.set(serializedCoin, serializedCoinPointer);

      const serialContext = getSerializedCoin(
        coin[2] as string,
      ) as unknown as ArrayLike<number>;

      const serialContextPointer = Module._malloc(serializedCoin.length);
      Module.HEAPU8.set(serialContext, serialContextPointer);

      const deserializedCoinObj = Module.ccall(
        'js_deserializeCoin',
        'number',
        ['number', 'number', 'number', 'number'],
        [
          serializedCoinPointer,
          serializedCoin.length,
          serialContextPointer,
          serialContext.length,
        ],
      );

      if (!deserializedCoinList[set]) {
        deserializedCoinList[set] = [];
      }
      deserializedCoinList[set].push(deserializedCoinObj);
    });
  }

  console.debug('deserializedCoinList =:=:=:=', deserializedCoinList);

  const setHashList = await db.getSetHashes();

  for (const setId in groupedBySet) {
    const coverSetRepresentation = Buffer.from(
      setHashList[+setId - 1],
      'base64',
    );
    console.debug('coverSetRepresentation :=>', coverSetRepresentation);
    const coverSetRepresentationPointer = Module._malloc(
      coverSetRepresentation.length,
    );
    Module.HEAPU8.set(coverSetRepresentation, coverSetRepresentationPointer);

    const coverSetData = Module.ccall(
      'js_createCoverSetData',
      'number',
      ['number', 'number'],
      [coverSetRepresentationPointer, coverSetRepresentation.length],
    );

    console.debug(groupedBySet, setId);
    deserializedCoinList[setId].forEach(deserializedCoin => {
      Module.ccall(
        'js_addCoinToCoverSetData',
        null,
        ['number', 'number'],
        [coverSetData, deserializedCoin],
      );
      // console.count('Deserialized coin added');
    });

    // Add cover set data to map (with group ID 1)
    Module.ccall(
      'js_addCoverSetDataForCreateSparkSpendTransaction',
      null,
      ['number', 'number', 'number'],
      [coverSetDataMap, BigInt(setId), coverSetData],
    );
  }

  const idAndBlockHashesMap = Module.ccall(
    'js_createIdAndBlockHashesMapForCreateSparkSpendTransaction',
    'number',
    [],
    [],
  );

  const blockHashList = await db.getBlockHashes();
  console.log('blockHashList =>>>', blockHashList);
  for (const setId in groupedBySet) {
    console.log(BigInt(setId), base64ToHex(blockHashList[+setId - 1]));
    Module.ccall(
      'js_addIdAndBlockHashForCreateSparkSpendTransaction',
      null,
      ['number', 'number', 'string'],
      [
        idAndBlockHashesMap,
        BigInt(setId),
        base64ToReversedHex(blockHashList[+setId - 1]),
      ],
    );
  }

  const tempTx = new bitcoin.Psbt({ network: network.networkInfo });
  tempTx.setVersion(3 | (SPARK_TX_TYPE << 16)); // version 3 and tx type in high bits (3 | (SPARK_TX_TYPE << 16));
  tempTx.setLocktime(LOCK_TIME); // new Date().getTime() / 1000

  tempTx.addInput({
    hash: '0000000000000000000000000000000000000000000000000000000000000000',
    index: 4294967295,
    sequence: 4294967295,
    finalScriptSig: Buffer.from('d3', 'hex'),
  });

  console.log(
    'tempTx.data.globalMap',
    tempTx.data.globalMap,
    'tempTx.data.inputs',
    tempTx.data.inputs,
    'tempTx',
    tempTx,
  );

  const tempTxBuffer = tempTx.extractTransaction(true).toBuffer();
  const extendedTempTxBuffer = Buffer.concat([
    tempTxBuffer,
    Buffer.from([0x00]),
  ]);

  console.log('buffer for temp', extendedTempTxBuffer.toHex());

  const txHash = bitcoin.crypto.hash256(extendedTempTxBuffer);
  const txHashSig = txHash.reverse().toString('hex');

  // TODO: check not spark case
  if (!isSparkTransaction) {
    tempTx.addOutput({
      address: to,
      value: parseFloat(amount),
    });
  }

  //tempTx// tx.signInput(0, spendKeyObj);
  // ? how to sign? Is I need to sign wit all utxo keypairs?
  // ? how to add subtractFeeFromAmount?
  // ? what is spend transaction type?
  // https://github.com/firoorg/sparkmobile/blob/main/include/spark.h#L22
  // tx.finalizeAllInputs();
  // const txHash = tx.extractTransaction()

  // const txHashSig = txHash.getHash()

  console.log('coinlist', coinsList);
  const additionalTxSize = 0;

  // Create the spend transaction
  const result = Module.ccall(
    'js_createSparkSpendTransaction',
    'number',
    [
      'number',
      'number',
      'number',
      'number',
      'number',
      'number',
      'number',
      'number',
      'string',
      'number',
    ],
    [
      spendKeyObj,
      fullViewKeyObj,
      incomingViewKeyObj,
      recipientsVector,
      privateRecipientsVector,
      coinsList,
      coverSetDataMap,
      idAndBlockHashesMap,
      txHashSig,
      additionalTxSize,
    ],
  );

  console.log('final result is', result);

  if (result) {
    // Get transaction details

    const serializedSpend = Module.ccall(
      'js_getCreateSparkSpendTxResultSerializedSpend',
      'number', // returns a pointer to the beginning of a byte array
      ['number'],
      [result],
    );

    console.log('serializedSpend ==> ==>', serializedSpend);

    const serializedSpendSize = Module.ccall(
      'js_getCreateSparkSpendTxResultSerializedSpendSize',
      'number',
      ['number'],
      [result],
    );

    console.log(`Serialized spend size: `, serializedSpendSize);

    const serializedSpendBytes = new Uint8Array(
      Module.HEAPU8.buffer,
      serializedSpend,
      serializedSpendSize,
    );

    // Make a copy (optional, because the above is just a view into WASM memory)
    const spendDataCopy = new Uint8Array(serializedSpendBytes);

    // If you need hex
    const hex = Array.from(spendDataCopy)
      .map(b => b.toString(16).padStart(2, '0'))
      .join('');

    console.log('Serialized Spend (hex):', hex);

    const outputScriptsSize = Module.ccall(
      'js_getCreateSparkSpendTxResultOutputScriptsSize',
      'number',
      ['number'],
      [result],
    );
    const scripts = [];

    // Get each output script
    for (let i = 0; i < outputScriptsSize; i++) {
      const scriptPtr = Module.ccall(
        'js_getCreateSparkSpendTxResultOutputScriptAt',
        'number', // returns a pointer to the beginning of a byte array
        ['number', 'number'],
        [result, i],
      );

      console.log(`Output script in for:`, scriptPtr);
      const scriptSize = Module.ccall(
        'js_getCreateSparkSpendTxResultOutputScriptSizeAt',
        'number',
        ['number', 'number'],
        [result, i],
      );
      console.log(`Output script ${i} size: ${scriptSize}`);

      const script = new Uint8Array(
        Module.HEAPU8.buffer,
        scriptPtr,
        scriptSize,
      );

      scripts.push(script);
    }

    // Get spent coins information
    const spentCoinsSize = Module.ccall(
      'js_getCreateSparkSpendTxResultSpentCoinsSize',
      'number',
      ['number'],
      [result],
    );

    console.log(`Spent coins size: ${spentCoinsSize}`);

    for (let i = 0; i < spentCoinsSize; i++) {
      const spentCoinMeta = Module.ccall(
        'js_getCreateSparkSpendTxResultSpentCoinAt',
        'number',
        ['number', 'number'],
        [result, i],
      );
      const spentCoinValue = Module.ccall(
        'js_getCSparkMintMetaValue',
        'number',
        ['number'],
        [spentCoinMeta],
      );

      // const coinPtr = Module.ccall(
      //   '_js_getCoinFromMeta',
      //   'number',
      //   ['number', 'number'],
      //   [spentCoinMeta, incomingViewKeyObj],
      // );
      //
      // const hash = Module.ccall(
      //   '_js_getCoinHash',
      //   'number',
      //   ['number'],
      //   [coinPtr],
      // );
      //
      // console.log(
      //   `coin hash =>>>>>>======>>>>><<<<<======<<<<<<<======: ${hash}`,
      // );
      const spentCoinMetaDeserialized = new Uint8Array(
        Module.HEAPU8.buffer,
        spentCoinMeta,
        spentCoinMeta.length,
      );

      console.log(
        `spend coins meta @nd value ${spentCoinValue}, ${spentCoinMeta} ${spentCoinMetaDeserialized.toString()}`,
      );
    }

    // TODO: check if this is needed
    // Get transaction fee
    const fee = Module.ccall(
      'js_getCreateSparkSpendTxResultFee',
      'number',
      ['number'],
      [result],
    );

    const psbt = new bitcoin.Psbt({ network: network.networkInfo });

    // const api = (await network.api()) as unknown as FiroAPI;

    psbt.addInput({
      hash: '0000000000000000000000000000000000000000000000000000000000000000',
      index: 4294967295,
      sequence: 4294967295,
      finalScriptSig: Buffer.from('d3', 'hex'),
    });

    psbt.setLocktime(LOCK_TIME);

    psbt.setVersion(3 | (SPARK_TX_TYPE << 16));
    scripts.forEach(script => {
      console.log('script is ==>', script);
      psbt.addOutput({
        script: Buffer.from(script),
        value: 0,
      });
    });

    const rawTx = psbt.extractTransaction();
    const txHex = rawTx.toHex();
    const sizeHex = numberToReversedHex(serializedSpendSize);
    const finalTx = txHex + 'fd' + sizeHex + hex;

    console.log('Final TX to broadcast:', finalTx);

    return finalTx;

    // TODO: free memory
  }
}
Critical: Missing WASM memory cleanup.
The function allocates numerous WASM objects throughout execution (spendKeyObj, fullViewKeyObj, incomingViewKeyObj, addressObj, recipientsVector, privateRecipientsVector, coinsList, coverSetDataMap, idAndBlockHashesMap, coverSetData, deserialized coins, etc.) but never frees them. This will cause severe memory leaks in the WASM heap.
Add a finally block to ensure all WASM-allocated resources are properly freed, regardless of success or failure.
🤖 Prompt for AI Agents
packages/extension/src/libs/spark-handler/index.ts lines 22-530: the function
allocates many WASM objects and malloc'ed buffers but never releases them,
causing memory leaks; add a try/finally around the main logic and in finally
free every WASM object and malloc pointer you create — at minimum call the
appropriate Module.ccall free functions (or Module._free for raw pointers) for:
spendKeyObj, fullViewKeyObj, incomingViewKeyObj, addressObj, recipientsVector,
privateRecipientsVector, coinsList, coverSetDataMap, idAndBlockHashesMap, any
coverSetData created in the loop, every deserializedCoinObj pushed into
deserializedCoinList, and the Module result (e.g.
js_freeCreateSparkSpendTxResult if available); also free all malloc pointers
such as serializedCoinPointer, serialContextPointer,
coverSetRepresentationPointer, and any other Module._malloc allocations; ensure
you guard frees with existence checks and catch/ignore freeing errors so cleanup
always runs regardless of earlier errors.
Added freeing memory executions
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Fetch the current state of the spark-handler index.ts file to verify memory cleanup
cat packages/extension/src/libs/spark-handler/index.ts
Length of output: 14517
@narekpetrosyan, thank you for addressing some of the memory leaks! I can see you've added cleanup for several malloc pointers (serializedCoinPointer, serialContextPointer, coverSetRepresentationPointer, serializedSpend, scriptPtr, and spentCoinMeta). This is good progress.
However, the memory leak issue is not fully resolved yet. There are still critical problems:
🔴 Critical: Missing try/finally block
The most important issue is that there's no try/finally block. Without it, if any error occurs or the function returns early (like when result is falsy), none of the cleanup code will execute. This means memory will leak in error cases.
🔴 Major WASM objects still not freed
Many high-level WASM objects are still never freed:
Key objects created but never freed:
- spendKeyObj (line 27)
- incomingViewKeyObj and fullViewKeyObj (lines 51-53)
- addressObj (line 55)
- recipientsVector (line 62)
- privateRecipientsVector (line 72)
- coinsList (line 95)
- coverSetDataMap (line 150)
- All coverSetData instances created in the loop (line 223)
- idAndBlockHashesMap (line 248)
- result from js_createSparkSpendTransaction (line 310)
- All deserializedCoinObj instances in deserializedCoinList (never freed after being stored)
- All metadata objects in spendCoinList (line 121)
export async function sendFromSparkAddress(...): Promise<string | undefined> {
// Track all allocated resources
let spendKeyObj = 0, fullViewKeyObj = 0, incomingViewKeyObj = 0;
let addressObj = 0, recipientsVector = 0, privateRecipientsVector = 0;
// ... etc
try {
// All existing logic here
} finally {
// Free everything, with existence checks
try {
if (spendKeyObj) Module.ccall('js_freeSpendKey', null, ['number'], [spendKeyObj]);
if (fullViewKeyObj) Module.ccall('js_freeFullViewKey', null, ['number'], [fullViewKeyObj]);
// ... free all other objects
// Free all deserializedCoinObj instances
Object.values(deserializedCoinList).flat().forEach(obj => {
if (obj) Module._free(obj);
});
} catch (e) {
console.error('Cleanup error:', e);
}
}
}
This ensures cleanup happens regardless of success or failure, preventing memory leaks in production.
All needed cleanups are completed.
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Fetch the current state of the spark-handler index.ts file to verify cleanup implementation
cat packages/extension/src/libs/spark-handler/index.ts
Length of output: 16290
@narekpetrosyan, thank you for adding the try/finally structure and several cleanup operations! This is good progress. However, the memory cleanup is still incomplete, and there's also a critical bug in the finally block.
🔴 Critical Bug in Finally Block
Line 514 has a copy-paste error:
if (incomingViewKeyObj && incomingViewKeyObj !== 0) {
Module.ccall(
'js_freeIncomingViewKey',
null,
['number'],
[fullViewKeyObj], // ❌ WRONG! Should be incomingViewKeyObj
);
}
This frees fullViewKeyObj twice (once on line 508, again on line 514) but never frees incomingViewKeyObj.
🔴 Major WASM Objects Still Not Freed
The following objects are created but never freed in the finally block:
- coinsList (line 105) - created via js_createCoinsListForCreateSparkSpendTransaction
- coverSetDataMap (line 169) - created via js_createCoverSetDataMapForCreateSparkSpendTransaction
- idAndBlockHashesMap (line 274) - created via js_createIdAndBlockHashesMapForCreateSparkSpendTransaction
- result (line 335) - created via js_createSparkSpendTransaction, likely needs js_freeCreateSparkSpendTxResult
- All deserializedCoinObj instances stored in deserializedCoinList (line 210) - these accumulate in the object but are never freed
- All metadata objects in spendCoinList (line 134) - from getSparkCoinInfo, these metadata handles are never freed
- All coverSetData instances created in the loop (line 235) - one per setId, never freed
These all need to be tracked and freed in the finally block. Consider:
let coinsList = 0;
let coverSetDataMap = 0;
let idAndBlockHashesMap = 0;
let result = 0;
const coverSetDataList: number[] = []; // track all coverSetData
try {
// ... existing code ...
const coverSetData = Module.ccall(...);
coverSetDataList.push(coverSetData); // track it
} finally {
// ... existing frees ...
if (coinsList) Module.ccall('js_freeCoinsList', null, ['number'], [coinsList]);
if (coverSetDataMap) Module.ccall('js_freeCoverSetDataMap', null, ['number'], [coverSetDataMap]);
if (idAndBlockHashesMap) Module.ccall('js_freeIdAndBlockHashesMap', null, ['number'], [idAndBlockHashesMap]);
if (result) Module.ccall('js_freeCreateSparkSpendTxResult', null, ['number'], [result]);
// Free all deserializedCoinObj
Object.values(deserializedCoinList).flat().forEach(obj => {
if (obj) Module._free(obj);
});
// Free all coverSetData
coverSetDataList.forEach(data => {
if (data) Module.ccall('js_freeCoverSetData', null, ['number'], [data]);
});
// Free all metadata in spendCoinList
spendCoinList.forEach(item => {
if (item.metadata) Module._free(item.metadata);
});
}
Please fix the bug on line 514 and add cleanup for all the missing objects.
Changes applied
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Fetch the current state of the spark-handler index.ts file to verify the cleanup implementation
cat packages/extension/src/libs/spark-handler/index.ts
Length of output: 15788
@narekpetrosyan, thank you for fixing the line 514 bug and adding several cleanup operations! The bug where incomingViewKeyObj was being checked but fullViewKeyObj was freed twice has been resolved. However, there are still critical memory leak issues that need to be addressed.
🔴 Critical Issue #1: No try/finally block
All cleanup code is placed at the end of the function (lines 503-542), but not within a finally block. This means:
- If any error occurs anywhere in the function, the cleanup code will never execute
- Memory will leak whenever an error is thrown
Why this matters: Looking at your code, there are many operations that can throw errors:
- Database operations (db.readData, db.getSetById, etc.)
- The explicit try/catch block around line 161 shows errors can happen
Required fix: Wrap the main logic in try/finally:
export async function sendFromSparkAddress(...): Promise<string | undefined> {
let spendKeyObj = 0;
// ... declare all other tracked variables ...
try {
// ALL existing logic here (lines 44-487)
return finalTx;
} finally {
// ALL cleanup code here (your current lines 503-542)
}
}
🔴 Critical Issue #2: deserializedCoinList objects not properly freed
Line 534: if (deserializedCoinObj) { Module._free(deserializedCoinObj); }
This only frees the last deserializedCoinObj that was assigned. But looking at line 210, deserializedCoinObj is reassigned many times in the loop, and each instance is pushed to deserializedCoinList[set]. All of these need to be freed.
Required fix:
// Replace line 534 with:
Object.values(deserializedCoinList).flat().forEach(obj => {
if (obj) Module._free(obj);
});
🔴 Critical Issue #3: coverSetData objects not properly freed
Line 539: if (coverSetData && coverSetData !== 0) { Module.ccall('js_freeCoverSetData', null, ['number'], [coverSetData]); }
Similar issue: coverSetData is created multiple times in the loop starting at line 230 (one per setId). Only the last instance gets freed.
Required fix: Track all coverSetData instances:
// At the top with other declarations:
const coverSetDataList: number[] = [];
// In the loop at line 235, after creating coverSetData:
coverSetData = Module.ccall('js_createCoverSetData', ...);
coverSetDataList.push(coverSetData); // Track it
// In finally block, replace line 539 with:
coverSetDataList.forEach(data => {
if (data) Module.ccall('js_freeCoverSetData', null, ['number'], [data]);
});
🔴 Critical Issue #4: Missing malloc pointer cleanup
These malloc'd pointers are never freed:
- Line 202: serializedCoinPointer
- Line 207: serialContextPointer
Required fix: Free them immediately after use:
// After line 220 (after the deserializedCoinObj is created):
Module._free(serializedCoinPointer);
Module._free(serialContextPointer);
// After line 235 (after coverSetData is created):
Module._free(coverSetRepresentationPointer);
🔴 Critical Issue #5: Missing spendCoinList metadata cleanup
Each item in spendCoinList has a metadata property (line 144) that's never freed.
Required fix: Add to finally block:
spendCoinList.forEach(item => {
if (item.metadata) Module.ccall('js_freeMetadata', null, ['number'], [item.metadata]);
if (item.deserializedCoinObj) Module._free(item.deserializedCoinObj);
});
Summary of remaining work:
- Add try/finally structure
- Fix deserializedCoinList cleanup (free all, not just last)
- Fix coverSetData cleanup (free all, not just last)
- Add malloc pointer cleanup (serializedCoinPointer, serialContextPointer, coverSetRepresentationPointer)
- Add spendCoinList metadata cleanup
These fixes are essential to prevent memory leaks in production.
Actionable comments posted: 0
♻️ Duplicate comments (1)
packages/extension/src/libs/spark-handler/index.ts (1)
22-525: Critical: WASM memory leak remains unaddressed. As noted in previous reviews, this function allocates numerous WASM objects and malloc'd buffers without freeing them. The past review comment specifically identified this as critical and mentioned a TODO on line 528 about freeing memory.
Add a try/finally block to ensure all WASM resources are properly freed: spendKeyObj, fullViewKeyObj, incomingViewKeyObj, addressObj, recipientsVector, privateRecipientsVector, coinsList, coverSetDataMap, idAndBlockHashesMap, coverSetData instances, deserializedCoinObj instances, malloc'd pointers (serializedCoinPointer, serialContextPointer, coverSetRepresentationPointer), and the result object.
🧹 Nitpick comments (13)
packages/extension/src/libs/spark-handler/createTempFromSparkTx.ts (3)
11-11: Address the TODO comment. The TODO questions whether the entire tempTx construction is necessary if only txHashSig is used. This suggests the implementation may be incomplete or could be simplified. Do you want me to help clarify the requirements or generate a simplified implementation if the PSBT construction is unnecessary?
18-18: Unused variable: isSparkTransaction is never referenced. The result of isSparkAddress(to) is stored but never used in the function body. Either remove the check or implement conditional logic based on its value.
🔎 Proposed fix to remove unused variable
  }: CreateTempFromSparkTxArgs): Promise<string> => {
-   const isSparkTransaction = await isSparkAddress(to);
    const tempTx = new bitcoin.Psbt({ network });
8-8: Unused parameter: amount is never referenced. The amount parameter is declared in the interface but never used in the function implementation. Consider removing it if not needed, or implement the intended logic if it's incomplete.
packages/extension/src/libs/spark-handler/index.ts (7)
127-128: Remove empty else block. The empty else block on line 127-128 serves no purpose and can be removed for cleaner code.
🔎 Proposed fix
    if (!data.isUsed) {
      spendCoinList.push({
        coin: ownedCoin.coin,
        setId: ownedCoin.setId,
        metadata: data.metaData,
        deserializedCoinObj: data.deserializedCoinObj,
      });
-   } else {
-   }
+   }
  })
180-180: Address the TODO comment. The TODO on line 180 suggests moving the coin deserialization logic (lines 182-216) to a separate function. This would improve readability and maintainability.
Do you want me to generate a refactored version that extracts this logic into a dedicated helper function?
259-259: Remove unused variable offset. The offset variable is declared and incremented but never used. This appears to be leftover from fixing the index mapping issue in a previous review.
🔎 Proposed fix
  const chunks = splitIntoChunks(txids, batchsize);
- let offset = 0;
  for (const chunk of chunks) {
    const res = await Promise.all(
      chunk.map(el =>
        this.mainClient?.blockchain_transaction_get(el, verbose),
      ),
    );
    res.forEach((el: string, index: number) => {
      ret[chunk[index]] = el;
    });
-   offset += chunk.length;
  }
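For reference, splitIntoChunks is assumed here to be a plain array-chunking helper along these lines (illustrative sketch, not necessarily the PR's implementation):

// Split an array into consecutive chunks of at most `size` elements.
function splitIntoChunks<T>(items: T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

With that shape, chunk[index] is the correct key for each batched result, and no running offset is needed.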
320-324: Remove commented-out code. Lines 320-324 contain commented-out code related to transaction signing and finalization. Remove this dead code or implement the logic if it's needed.
456-473: Remove commented-out code. Lines 456-473 contain commented-out code for retrieving coin hash from metadata. Remove this dead code or implement the logic if it's needed.
114-521: Consider removing debug console.log statements. The function contains numerous console.log and console.debug statements (lines 114, 177, 218, 227, 240, 268, 270, 294-301, 326, 359, 371, 380, 396, 415, 422, 441, 480, 509, 521). Consider removing these or replacing them with proper logging infrastructure for production code.
22-525: Consider refactoring this 500+ line function. The sendFromSparkAddress function is very long and handles multiple responsibilities (coin filtering, WASM interop, transaction building, result extraction). Consider breaking it into smaller, focused functions for better maintainability and testability.
Suggested extraction candidates:
- Coin filtering and revalidation (lines 33-44)
- Recipient vector creation (lines 66-100)
- Coins list population (lines 109-158)
- Cover set data creation (lines 159-258)
- Transaction building (lines 283-318)
- Result extraction and processing (lines 361-523)
packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts (3)
259-269: Remove unused variable offset. The offset variable is declared on line 259 and incremented on line 269 but never actually used. This appears to be leftover from the fix to the index mapping issue in a previous review (where chunk[index] replaced txids[index]).
🔎 Proposed fix
  const chunks = splitIntoChunks(txids, batchsize);
- let offset = 0;
  for (const chunk of chunks) {
    const res = await Promise.all(
      chunk.map(el =>
        this.mainClient?.blockchain_transaction_get(el, verbose),
      ),
    );
    res.forEach((el: string, index: number) => {
      ret[chunk[index]] = el;
    });
-   offset += chunk.length;
  }
353-353: Fix typo: filterred → filtered. Line 353 contains a typo in the variable name.
🔎 Proposed fix
- const filterred = listUnspent.filter(el => el.length > 0).flat();
+ const filtered = listUnspent.filter(el => el.length > 0).flat();
  const rawTxs = await Promise.all(
-   filterred.map(({ txid }) => this.getTxRaw(txid, false)),
+   filtered.map(({ txid }) => this.getTxRaw(txid, false)),
  );
- return filterred.map((el, index) => ({
+ return filtered.map((el, index) => ({
    ...el,
    raw: rawTxs[index],
  }));
84-89: Infinite retry loop risk in connection error handling. If this.mainClient remains falsy, the retry logic on lines 86-88 will continue indefinitely every 5 seconds. Consider adding a maximum retry count or exponential backoff to prevent resource exhaustion.
Suggested improvement with retry limit
Add a retry counter and maximum attempts:
private retryCount = 0;
private readonly MAX_RETRIES = 10;

async connectMain() {
  try {
    const peer = getRandomHardcodedPeer();
    this.mainClient = new ElectrumClient(peer.host, 50004, 'wss');
    await this.mainClient.connect('electrum-client-browser', '1.4');
    this.wasConnectedAtLeastOnce = true;
    this.retryCount = 0; // Reset on success
  } catch (e) {
    this.mainClient = undefined;
    console.error('electrum_wallet:connectMain', e);
    if (this.retryCount < this.MAX_RETRIES) {
      this.retryCount++;
      console.warn(
        `electrum_wallet:connectMain retry ${this.retryCount}/${this.MAX_RETRIES}`,
      );
      setTimeout(() => {
        this.connectMain();
      }, 5000 * this.retryCount); // Linearly increasing backoff
    } else {
      console.error('electrum_wallet:connectMain max retries exceeded');
    }
  }
}
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
packages/extension/src/libs/spark-handler/createTempFromSparkTx.ts
packages/extension/src/libs/spark-handler/index.ts
packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts
🧰 Additional context used
🧬 Code graph analysis (2)
packages/extension/src/libs/spark-handler/createTempFromSparkTx.ts (2)
- packages/extension/src/providers/bitcoin/libs/utils.ts (1): isSparkAddress (10-29)
- packages/extension/src/libs/spark-handler/constants.ts (2): SPARK_TX_TYPE (1-1), LOCK_TIME (3-3)
packages/extension/src/providers/bitcoin/libs/electrum-client/electrum-client.ts (2)
- packages/extension/src/providers/bitcoin/libs/electrum-client/abstract-electrum.ts (5): AnonymitySetMetaModel (66-70), TransactionModel (1-6), FullTransactionModel (40-58), UnspentTxOutputModel (8-18), UsedSerialsModel (72-74)
- packages/extension/src/providers/bitcoin/libs/firo-wallet/firo-wallet.ts (1): SATOSHI (39-39)
Actionable comments posted: 2
♻️ Duplicate comments (1)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts (1)
143-146: Use specific free functions for WASM-allocated objects. Lines 144-146 use generic wasm._free() for objects that were created with specific constructor functions (js_createSpendKey, js_createFullViewKey, js_createIncomingViewKey). These objects likely require their corresponding destructor functions to properly release internal state.
🔎 Proposed fix
- wasm._free(addressObj);
- wasm._free(spendKeyObj);
- wasm._free(fullViewKeyObj);
- wasm._free(incomingViewKeyObj);
+ wasm._free(addressObj);
+ wasm.ccall('js_freeSpendKey', null, ['number'], [spendKeyObj]);
+ wasm.ccall('js_freeFullViewKey', null, ['number'], [fullViewKeyObj]);
+ wasm.ccall('js_freeIncomingViewKey', null, ['number'], [incomingViewKeyObj]);
Based on learnings from the past review comment, all WASM-allocated objects should use their specific free functions rather than generic _free().
🧹 Nitpick comments (2)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts (2)
6-82: Consider adding JSDoc comments for memory management expectations. The functions getSpendKeyObj and getIncomingViewKey return WASM object handles that require proper cleanup. Adding JSDoc comments would help clarify:
- Which object handles require cleanup
- Which cleanup functions should be used
- Lifetime expectations for returned objects
Example JSDoc
/**
 * Creates a Spark spend key object from the wallet's private key.
 *
 * @param wasm - The initialized WASM module instance
 * @returns A WASM object handle for the spend key (must be freed with js_freeSpendKey), or 0 on failure
 */
export const getSpendKeyObj = async (wasm: WasmModule) => {
  // ...
};

/**
 * Creates full view key and incoming view key objects from a spend key.
 *
 * @param wasm - The initialized WASM module instance
 * @param spendKeyObj - A valid spend key object handle
 * @returns Object handles for full view key (free with js_freeFullViewKey) and
 *          incoming view key (free with js_freeIncomingViewKey), or zeros on failure
 */
export const getIncomingViewKey = async (
  wasm: WasmModule,
  spendKeyObj: number,
): Promise<{ fullViewKeyObj: number; incomingViewKeyObj: number }> => {
  // ...
};
94-94: Consider making the Spark address encoding configurable for testnet support. The is_test_network constant is hardcoded to 0 on line 94, which prevents testnet address generation for Spark. While Kadena has testnet support (e.g., kadena-testnet.ts with isTestNetwork: true), the Spark wallet handler ignores the network context and always generates mainnet addresses. Either make this configurable based on the active network's isTestNetwork flag, or document why Spark address generation is restricted to mainnet only.
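A minimal sketch of the suggested wiring; the extra flag parameter on the WASM call is hypothetical and would have to exist in the actual exports:

// Sketch: derive the flag from the active network instead of hardcoding 0.
// `network.isTestNetwork` and the third ccall argument are assumptions here.
const isTestNetwork = network.isTestNetwork ? 1 : 0;
const addressObj = wasm.ccall(
  'js_getAddress',
  'number',
  ['number', 'number', 'number'],
  [incomingViewKeyObj, diversifier, isTestNetwork],
);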
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts
🧰 Additional context used
🧬 Code graph analysis (1)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts (3)
- packages/extension/src/providers/bitcoin/libs/firo-wallet/public-firo-wallet.ts (1): PublicFiroWallet (3-71)
- packages/extension/src/ui/action/types/account.ts (1): SparkAccount (11-16)
- packages/extension/src/libs/utils/wasm-loader.ts (1): wasmInstance (70-70)
Actionable comments posted: 3
♻️ Duplicate comments (1)
packages/extension/src/libs/spark-handler/index.ts (1)
22-526: Critical: WASM memory leak remains unresolved. The previous critical review comment about missing WASM memory cleanup has not been fully addressed. The function still allocates numerous WASM objects without freeing them:
- spendKeyObj (line 30)
- fullViewKeyObj, incomingViewKeyObj (lines 53-56)
- addressObj (lines 58-63)
- recipientsVector (lines 66-71)
- privateRecipientsVector (lines 82-87)
- coinsList (lines 102-107)
- coverSetDataMap (lines 159-164)
- coverSetData created in loop (lines 234-239)
- deserializedCoinObj objects in deserializedCoinList (lines 199-214)
- idAndBlockHashesMap (lines 264-269)
- result object (lines 325-352)
Add a try/finally block to ensure all WASM-allocated resources are freed on both success and failure paths. Call the appropriate js_free* functions for each object type.
🧹 Nitpick comments (2)
packages/extension/src/libs/spark-handler/index.ts (2)
127-128: Remove empty else block. The empty else block serves no purpose and reduces code clarity.
🔎 Proposed fix
    if (!data.isUsed) {
      spendCoinList.push({
        coin: ownedCoin.coin,
        setId: ownedCoin.setId,
        metadata: data.metaData,
        deserializedCoinObj: data.deserializedCoinObj,
      });
-   } else {
    }
315-319: Remove or document commented-out code. Multiple blocks of commented-out code remain in the file (lines 315-319, 457-473, 488-493). If this code is needed for future reference or debugging, move it to documentation or issue comments. Otherwise, remove it to improve maintainability.
Also applies to: 457-473, 488-493
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts
packages/extension/src/libs/spark-handler/index.ts
Actionable comments posted: 2
♻️ Duplicate comments (1)
packages/extension/src/libs/spark-handler/index.ts (1)
320-325: Critical: Incorrect value conversion for non-Spark output. Line 323 uses parseFloat(amount) but Bitcoin transaction values must be integers in satoshis. This is inconsistent with lines 90 and 108, which correctly use BigInt(+amount * 10 ** 8).
🔎 Proposed fix
  if (!isSparkTransaction) {
    tempTx.addOutput({
      address: to,
-     value: parseFloat(amount),
+     value: Math.floor(parseFloat(amount) * 1e8),
    });
  }
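The root hazard is binary floating point: decimal amounts like 0.1 have no exact float representation, so float-based conversion can land off by a satoshi. A string/decimal conversion avoids that; a minimal sketch using bignumber.js, which this PR already depends on (the helper name is illustrative):

import BigNumber from 'bignumber.js';

// Convert a decimal FIRO amount (given as a string) to integer satoshis.
function toSatoshis(amount: string): number {
  const sats = new BigNumber(amount).multipliedBy(1e8);
  if (!sats.isInteger()) throw new Error('amount has more than 8 decimal places');
  return sats.toNumber();
}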
🧹 Nitpick comments (1)
packages/extension/src/libs/spark-handler/index.ts (1)
140-140: Remove commented code and empty blocks. The file contains several instances of commented-out code and an empty else block that should be cleaned up:
- Line 140: Empty else block
- Lines 327-331: Commented code with TODO
- Lines 469-485: Large commented section
- Lines 500-505: Commented fee calculation
Consider removing these to improve maintainability, or convert TODOs to tracked issues.
Also applies to: 327-331, 469-485, 500-505
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
packages/extension/src/libs/spark-handler/index.ts
🧰 Additional context used
🧬 Code graph analysis (1)
packages/extension/src/libs/spark-handler/index.ts (11)
- packages/extension/src/ui/action/db/indexedDB.ts (2): IndexedDBHelper (10-165), DB_DATA_KEYS (3-8)
- packages/extension/src/libs/utils/wasm-loader.ts (1): wasmInstance (70-70)
- packages/extension/src/libs/spark-handler/generateSparkWallet.ts (2): getSpendKeyObj (6-56), getIncomingViewKey (58-91)
- packages/extension/src/providers/bitcoin/libs/utils.ts (1): isSparkAddress (10-29)
- packages/extension/src/ui/action/workers/sparkCoinInfoWorker.ts (1): OwnedCoinData (26-33)
- packages/extension/src/ui/action/utils/set-utils.ts (1): intersectSets (1-9)
- packages/extension/src/libs/spark-handler/getSparkCoinInfo.ts (1): getSparkCoinInfo (20-294)
- packages/utils/src/debug-logger.ts (1): error (388-392)
- packages/extension/src/libs/spark-handler/getSerializedCoin.ts (2): getSerializedCoin (14-16), base64ToHex (8-12)
- packages/extension/src/libs/spark-handler/utils.ts (1): base64ToReversedHex (14-23)
- packages/extension/src/libs/spark-handler/constants.ts (2): SPARK_TX_TYPE (1-1), LOCK_TIME (3-3)
🔇 Additional comments (1)
packages/extension/src/libs/spark-handler/index.ts (1)
22-36: Good setup with cleanup tracking variables. The function properly declares all WASM object tracking variables at the top, which is the correct pattern for ensuring they can be cleaned up in the finally block.
56ba84f to
034e40f
Compare
Actionable comments posted: 1
♻️ Duplicate comments (2)
packages/extension/src/libs/spark-handler/index.ts (1)
535-558: Critical: Many WASM objects still not freed, causing memory leaks. Even when the cleanup executes, it only frees a subset of allocated WASM objects. The following objects are created but never freed:
Missing frees:
- coinsList (line 113) - created via js_createCoinsListForCreateSparkSpendTransaction
- coverSetDataMap (line 170) - created via js_createCoverSetDataMapForCreateSparkSpendTransaction
- All coverSetData instances (line 245) - created via js_createCoverSetData in the loop, one per setId
- idAndBlockHashesMap (line 275) - created via js_createIdAndBlockHashesMapForCreateSparkSpendTransaction
- result (line 336) - created via js_createSparkSpendTransaction, likely needs js_freeCreateSparkSpendTxResult
- All deserializedCoinObj instances (lines 210-225) - accumulated in the deserializedCoinList object, never freed
- All metadata objects (line 135) - from getSparkCoinInfo with keepMemory: true, stored in spendCoinList
This issue was flagged in previous reviews but remains unresolved.
🔎 Add tracking variables and cleanup
At the top of the function, add:
let coinsList = 0;
let coverSetDataMap = 0;
let idAndBlockHashesMap = 0;
let result = 0;
const coverSetDataList: number[] = [];
When creating coverSetData (line 245), track it:
const coverSetData = Module.ccall(...);
coverSetDataList.push(coverSetData); // Track for cleanup
In the finally block, add these frees:
if (coinsList) Module.ccall('js_freeCoinsListForCreateSparkSpendTransaction', null, ['number'], [coinsList]);
if (coverSetDataMap) Module.ccall('js_freeCoverSetDataMapForCreateSparkSpendTransaction', null, ['number'], [coverSetDataMap]);
if (idAndBlockHashesMap) Module.ccall('js_freeIdAndBlockHashesMapForCreateSparkSpendTransaction', null, ['number'], [idAndBlockHashesMap]);
if (result) Module.ccall('js_freeCreateSparkSpendTxResult', null, ['number'], [result]);
coverSetDataList.forEach(data => {
  if (data) Module.ccall('js_freeCoverSetData', null, ['number'], [data]);
});
Object.values(deserializedCoinList).flat().forEach(obj => {
  if (obj) Module.ccall('js_freeCoin', null, ['number'], [obj]);
});
spendCoinList.forEach(coin => {
  if (coin.metadata) Module.ccall('js_freeCSparkMintMeta', null, ['number'], [coin.metadata]);
});
Note: Adjust function names to match your actual WASM API exports.
packages/extension/src/libs/spark-handler/generateSparkWallet.ts (1)
84-90: Critical: Incorrect double-free and missing cleanup for incomingViewKeyObj. Two memory management issues in the error path:
1. Line 87: Incorrect double-free pattern - After calling js_freeFullViewKey on line 86, line 87 also calls wasm._free(fullViewKeyObj). WASM object handles should only be freed using their specific js_free* functions, not _free. The _free call will corrupt the WASM heap.
2. Missing cleanup for incomingViewKeyObj - If incomingViewKeyObj is successfully created (lines 76-81) but an error occurs afterward, it is never freed before returning.
🔎 Proposed fix
  } catch {
    if (fullViewKeyObj && fullViewKeyObj !== 0) {
      wasm.ccall('js_freeFullViewKey', null, ['number'], [fullViewKeyObj]);
-     wasm._free(fullViewKeyObj);
    }
+   if (incomingViewKeyObj && incomingViewKeyObj !== 0) {
+     wasm.ccall('js_freeIncomingViewKey', null, ['number'], [incomingViewKeyObj]);
+   }
    return { incomingViewKeyObj: 0, fullViewKeyObj: 0 };
  }
Based on learnings from past reviews, WASM object handles must only be freed via their specific js_free* functions, and all allocated objects must be cleaned up in error paths.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts
packages/extension/src/libs/spark-handler/index.ts
🧰 Additional context used
🧬 Code graph analysis (1)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts (3)
- packages/extension/src/providers/bitcoin/libs/firo-wallet/public-firo-wallet.ts (1): PublicFiroWallet (3-71)
- packages/extension/src/ui/action/types/account.ts (1): SparkAccount (11-16)
- packages/extension/src/libs/utils/wasm-loader.ts (1): wasmInstance (70-70)
Actionable comments posted: 3
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts
🧰 Additional context used
🧬 Code graph analysis (1)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts (3)
- packages/extension/src/providers/bitcoin/libs/firo-wallet/public-firo-wallet.ts (1): PublicFiroWallet (3-71)
- packages/extension/src/ui/action/types/account.ts (1): SparkAccount (11-16)
- packages/extension/src/libs/utils/wasm-loader.ts (1): wasmInstance (70-70)
🔇 Additional comments (3)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts (3)
6-56: LGTM! The memory management is correct: intermediate allocations (spendKeyDataObj, keyDataPtr) are cleaned up in the finally block, while the returned spendKeyObj handle is left for the caller to manage. The guarded cleanup with null/zero checks prevents crashes on early failures.
1-6: The proposed fix is incorrect: WasmModule cannot be imported from wasm-loader.ts. WasmModule is declared as a global interface in wasm-loader.ts (lines 6-31), not exported as a named export. It should be available globally without an explicit import. The suggested diff would fail at compilation because wasm-loader.ts does not export WasmModule.
Likely an incorrect or invalid review comment.
128-133: No issue identified. BigInt is the correct approach for passing 64-bit integers to Emscripten's ccall. This pattern is used consistently throughout the spark-handler codebase (e.g., getMintTxData.ts, index.ts) and is supported by modern Emscripten versions for parameters mapping to C's int64_t/uint64_t types. The ccall type signature accepts any[] for arguments, and no errors or type coercion issues are present.
Likely an incorrect or invalid review comment.
0abb136 to
35d8e3d
Compare
Actionable comments posted: 4
♻️ Duplicate comments (5)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts (1)
167-167: Inconsistent WASM function name with underscore prefix. Line 167 calls _js_freeAddress with an underscore prefix, which is inconsistent with all other free functions in this file (js_freeSpendKey, js_freeFullViewKey, js_freeIncomingViewKey) and with the same call in index.ts at line 536 (js_freeAddress without underscore).
This issue was flagged in previous reviews and marked as addressed, but the underscore prefix remains in the code.
🔎 Proposed fix
- wasm.ccall('_js_freeAddress', null, ['number'], [addressObj]);
+ wasm.ccall('js_freeAddress', null, ['number'], [addressObj]);
packages/extension/src/libs/spark-handler/index.ts (4)
22-569: Critical: Cleanup code not wrapped in try-finally block. The cleanup code (lines 521-567) runs unconditionally at the end of the function, but there's no try-finally structure wrapping the main logic. If the function throws an error or returns early (e.g., validation failures, WASM call failures), none of the cleanup executes, causing severe memory leaks.
This issue was extensively flagged in previous reviews but remains unresolved.
🔎 Recommended structure
export async function sendFromSparkAddress(
  network: bitcoin.Network,
  to: string,
  amount: string,
): Promise<string | undefined> {
  let spendKeyObj = 0;
  let fullViewKeyObj = 0;
  // ... other tracking variables
  try {
    // All main logic here (lines 43-518)
    return finalTx;
  } finally {
    // Move all cleanup here (lines 521-567)
    // Wrap in try-catch to prevent cleanup errors from masking original errors
    try {
      if (spendKeyObj && spendKeyObj !== 0) {
        Module.ccall('js_freeSpendKey', null, ['number'], [spendKeyObj]);
      }
      // ... rest of cleanup
    } catch (e) {
      console.error('Cleanup error:', e);
    }
  }
}
196-232: Critical: All deserialized coin objects not freed. Lines 216-226 create multiple deserializedCoinObj instances (one per coin) and store them in the deserializedCoinList object. However, the cleanup at lines 558-560 only frees the single deserializedCoinObj tracking variable, leaving all the instances stored in deserializedCoinList unfreed.
This was flagged in previous reviews but remains unresolved.
🔎 Proposed fix
Add to the cleanup section:
  if (deserializedCoinObj) {
    Module._free(deserializedCoinObj);
  }
+ // Free all deserialized coin objects
+ Object.values(deserializedCoinList).flat().forEach(coinObj => {
+   if (coinObj && coinObj !== 0) {
+     Module.ccall('js_freeCoin', null, ['number'], [coinObj]);
+   }
+ });
  if (result && result !== 0) {
    Module.ccall('js_freeCreateSparkSpendTxResult', null, ['number'], [result]);
  }
Note: Verify the correct WASM free function name for deserialized coins (might be js_freeCoin, js_freeDeserializedCoin, or similar).
Also applies to: 558-560
128-151: Critical: Metadata objects never freed. Line 141 stores metadata handles from getSparkCoinInfo in the spendCoinList array, but these WASM objects are never freed in the cleanup section. Each coin's metadata handle represents allocated WASM memory that will leak.
This was flagged in previous reviews but remains unresolved.
🔎 Proposed fix
Add to the cleanup section:
  if (coverSetData && coverSetData !== 0) {
    Module.ccall('js_freeCoverSetData', null, ['number'], [coverSetData]);
  }
+ // Free all metadata objects
+ spendCoinList.forEach(coin => {
+   if (coin.metadata && coin.metadata !== 0) {
+     Module.ccall('js_freeCSparkMintMeta', null, ['number'], [coin.metadata]);
+   }
+ });
Note: Verify the correct WASM free function name for metadata objects (might be js_freeCSparkMintMeta, js_freeMetadata, or similar).
Also applies to: 521-567
237-272: Critical: All coverSetData objects not freed. Lines 248-253 create a coverSetData object inside the loop over groupedBySet (one per setId), but the cleanup at lines 564-566 only frees the single tracking variable. All coverSetData instances created in the loop remain unfreed.
This was flagged in previous reviews but remains unresolved.
🔎 Proposed fix
Track all coverSetData instances:
let deserializedCoinObj = 0;
let coverSetData = 0;
+ const coverSetDataList: number[] = [];
// ... later in the loop:
coverSetData = Module.ccall(
  'js_createCoverSetData',
  'number',
  ['number', 'number'],
  [coverSetRepresentationPointer, coverSetRepresentation.length],
);
+ coverSetDataList.push(coverSetData);
// ... in cleanup section:
if (coverSetData && coverSetData !== 0) {
  Module.ccall('js_freeCoverSetData', null, ['number'], [coverSetData]);
}
+ // Free all coverSetData from loop
+ coverSetDataList.forEach(data => {
+   if (data && data !== 0) {
+     Module.ccall('js_freeCoverSetData', null, ['number'], [data]);
+   }
+ });
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts
packages/extension/src/libs/spark-handler/index.ts
🧰 Additional context used
🧬 Code graph analysis (2)
packages/extension/src/libs/spark-handler/index.ts (8)
- packages/extension/src/libs/spark-handler/generateSparkWallet.ts (2): getSpendKeyObj (6-56), getIncomingViewKey (58-102)
- packages/extension/src/providers/bitcoin/libs/utils.ts (1): isSparkAddress (10-29)
- packages/extension/src/ui/action/workers/sparkCoinInfoWorker.ts (1): OwnedCoinData (26-33)
- packages/extension/src/ui/action/utils/set-utils.ts (1): intersectSets (1-9)
- packages/extension/src/libs/spark-handler/getSparkCoinInfo.ts (1): getSparkCoinInfo (20-294)
- packages/extension/src/libs/spark-handler/getSerializedCoin.ts (2): getSerializedCoin (14-16), base64ToHex (8-12)
- packages/extension/src/libs/spark-handler/utils.ts (2): base64ToReversedHex (14-23), numberToReversedHex (1-12)
- packages/extension/src/libs/spark-handler/constants.ts (2): SPARK_TX_TYPE (1-1), LOCK_TIME (3-3)
packages/extension/src/libs/spark-handler/generateSparkWallet.ts (3)
- packages/extension/src/providers/bitcoin/libs/firo-wallet/public-firo-wallet.ts (1): PublicFiroWallet (3-71)
- packages/extension/src/ui/action/types/account.ts (1): SparkAccount (11-16)
- packages/extension/src/libs/utils/wasm-loader.ts (1): wasmInstance (70-70)
Actionable comments posted: 2
♻️ Duplicate comments (4)
packages/extension/src/libs/spark-handler/index.ts (4)
200-234: Critical: Memory leaks in coin deserialization loop - malloc'd pointers never freed. Inside the nested loops (lines 200-234), serializedCoinPointer (line 207) and serialContextPointer (line 214) are allocated with Module._malloc on every iteration but are never freed. This causes severe memory leaks proportional to the number of coins processed.
This exact issue was flagged in previous reviews but remains unresolved.
🔎 Proposed fix: Free malloc'd pointers after use
  for (const set in groupedBySet) {
    const fullSet = await db.getSetById(Number(set));
    fullSet?.coins.forEach(coin => {
      // ... existing deserialization of serializedCoin and serialContext ...
      deserializedCoinObj = Module.ccall(
        'js_deserializeCoin',
        'number',
        ['number', 'number', 'number', 'number'],
        [
          serializedCoinPointer,
          serializedCoin.length,
          serialContextPointer,
          serialContext.length,
        ],
      );
+
+     // Free malloc'd buffers immediately after use
+     Module._free(serializedCoinPointer);
+     Module._free(serialContextPointer);

      if (!deserializedCoinList[set]) {
        deserializedCoinList[set] = [];
      }
      deserializedCoinList[set].push(deserializedCoinObj);
    });
  }
129-152: Critical: Metadata objects in spendCoinList never freed. Lines 129-152 create metadata objects via getSparkCoinInfo with keepMemory: true, storing them in spendCoinList (line 142). These WASM objects accumulate but are never freed in the cleanup section, causing memory leaks proportional to the number of owned coins.
This was flagged in previous reviews but remains unresolved.
🔎 Proposed fix: Free metadata objects in cleanup
Add to the cleanup section (in the finally block):
  if (coverSetData && coverSetData !== 0) {
    Module.ccall('js_freeCoverSetData', null, ['number'], [coverSetData]);
  }
+
+ // Free all metadata objects from spendCoinList
+ spendCoinList.forEach(coin => {
+   if (coin.metadata) {
+     Module.ccall('js_freeCSparkMintMeta', null, ['number'], [coin.metadata]);
+   }
+ });
Note: Adjust the free function name (js_freeCSparkMintMeta) to match your WASM API.
574-576: Critical: Only last deserializedCoinObj freed - all others in deserializedCoinList leak. Line 575 frees only the single deserializedCoinObj variable (the last one created in the loop), but deserializedCoinList (populated at line 232) contains all deserialized coin objects created across all sets. These accumulated WASM objects are never freed, causing severe memory leaks.
This was flagged in previous reviews but remains unresolved.
🔎 Proposed fix: Free all deserialized coins
Replace lines 574-576 with:
- if (deserializedCoinObj) {
-   Module._free(deserializedCoinObj);
- }
+ // Free all deserialized coin objects
+ Object.values(deserializedCoinList).flat().forEach(coinObj => {
+   if (coinObj) {
+     Module._free(coinObj);
+   }
+ });
Note: Adjust the free function name (js_freeCoin) to match your WASM API.
580-582: Critical: Only last coverSetData freed - all others created in loop leak. Lines 249-254 create a new coverSetData object on every iteration of the loop (one per setId in groupedBySet), but line 581 only frees the last one created. All previous instances leak.
This was flagged in previous reviews but remains unresolved.
🔎 Proposed fix: Track and free all coverSetData objects
At the top of the function, add a tracking array:
let deserializedCoinObj = 0;
let coverSetData = 0;
+ const coverSetDataList: number[] = [];
In the loop, track each created object:
coverSetData = Module.ccall(
  'js_createCoverSetData',
  'number',
  ['number', 'number'],
  [coverSetRepresentationPointer, coverSetRepresentation.length],
);
+ coverSetDataList.push(coverSetData);
In cleanup, free all:
- if (coverSetData && coverSetData !== 0) {
-   Module.ccall('js_freeCoverSetData', null, ['number'], [coverSetData]);
- }
+ // Free all cover set data objects
+ coverSetDataList.forEach(data => {
+   if (data && data !== 0) {
+     Module.ccall('js_freeCoverSetData', null, ['number'], [data]);
+   }
+ });
| export async function sendFromSparkAddress( | ||
| network: bitcoin.Network, | ||
| to: string, | ||
| amount: string, | ||
| ): Promise<string | undefined> { | ||
| let spendKeyObj = 0; | ||
| let fullViewKeyObj = 0; | ||
| let incomingViewKeyObj = 0; | ||
| let addressObj = 0; | ||
| let recipientsVector = 0; | ||
| let privateRecipientsVector = 0; | ||
| let coinsList = 0; | ||
| let coverSetDataMap = 0; | ||
| let idAndBlockHashesMap = 0; | ||
| let result = 0; | ||
| let deserializedCoinObj = 0; | ||
| let coverSetData = 0; | ||
| const diversifier = 1n; | ||
| const db = new IndexedDBHelper(); | ||
| const Module = await wasmInstance.getInstance(); | ||
|
|
||
| spendKeyObj = await getSpendKeyObj(Module); | ||
| const isSparkTransaction = await isSparkAddress(to); | ||
|
|
||
| const ownedCoins = ((await db.readData('myCoins')) || | ||
| []) as OwnedCoinData[]; | ||
|
|
||
| const usedCoinsTags = await db.readData<{ tags: string[] }>( | ||
| DB_DATA_KEYS.usedCoinsTags, | ||
| ); | ||
| const coinsTagsSet = new Set(usedCoinsTags.tags); | ||
| const myCoinsTagsSet = new Set((ownedCoins ?? []).map(coin => coin.tag)); | ||
|
|
||
| const usedMyCoinsTagsSet = intersectSets(coinsTagsSet, myCoinsTagsSet); | ||
| const revalidatedCoins = ownedCoins.filter(ownedCoin => { | ||
| return !usedMyCoinsTagsSet.has(ownedCoin.tag); | ||
| }); | ||
|
|
||
| const spendCoinList: { | ||
| coin: string[]; | ||
| setId: number; | ||
| metadata: number; | ||
| deserializedCoinObj: number; | ||
| }[] = []; | ||
|
|
||
| const keyObjects = await getIncomingViewKey( | ||
| Module, | ||
| spendKeyObj, | ||
| ); | ||
|
|
||
| incomingViewKeyObj = keyObjects.incomingViewKeyObj | ||
| fullViewKeyObj = keyObjects.fullViewKeyObj | ||
|
|
||
| addressObj = Module.ccall( | ||
| 'js_getAddress', | ||
| 'number', | ||
| ['number', 'number'], | ||
| [incomingViewKeyObj, diversifier], | ||
| ); | ||
|
|
||
| // Create recipients vector for spend transaction | ||
| recipientsVector = Module.ccall( | ||
| 'js_createRecipientsVectorForCreateSparkSpendTransaction', | ||
| 'number', | ||
| ['number'], | ||
| [1], // intended final size | ||
| ); | ||
|
|
||
| if (!isSparkTransaction) { | ||
| Module.ccall( | ||
| 'js_addRecipientForCreateSparkSpendTransaction', | ||
| null, | ||
| ['number', 'number', 'number'], | ||
| [recipientsVector, BigInt(+amount * 10 ** 8), 0], | ||
| ); | ||
| } | ||
|
|
||
| privateRecipientsVector = Module.ccall( | ||
| 'js_createPrivateRecipientsVectorForCreateSparkSpendTransaction', | ||
| 'number', | ||
| ['number'], | ||
| [1], // intended final size | ||
| ); | ||
|
|
||
| Module.ccall( | ||
| 'js_addPrivateRecipientForCreateSparkSpendTransaction', | ||
| null, | ||
| ['number', 'number', 'number', 'string', 'number'], | ||
| [ | ||
| privateRecipientsVector, | ||
| addressObj, | ||
| BigInt(+amount * 10 ** 8), | ||
| 'Private memo', | ||
| 1, | ||
| ], | ||
| ); | ||
|
|
||
| coinsList = Module.ccall( | ||
| 'js_createCoinsListForCreateSparkSpendTransaction', | ||
| 'number', | ||
| [], | ||
| [], | ||
| ); | ||
|
|
||
| const coinMetaPromiseList: Promise<void>[] = []; | ||
|
|
||
| revalidatedCoins.forEach(ownedCoin => { | ||
| const myCoinMetaData = getSparkCoinInfo({ | ||
| coin: ownedCoin.coin, | ||
| fullViewKeyObj, | ||
| incomingViewKeyObj, | ||
| wasmModule: Module, | ||
| keepMemory: true, | ||
| }) | ||
| .then(data => { | ||
| if (!data.isUsed) { | ||
| spendCoinList.push({ | ||
| coin: ownedCoin.coin, | ||
| setId: ownedCoin.setId, | ||
| metadata: data.metaData, | ||
| deserializedCoinObj: data.deserializedCoinObj, | ||
| }); | ||
| } else { | ||
| } | ||
| }) | ||
| .catch(err => { | ||
| console.error('Error getting spark coin info', err); | ||
| }); | ||
| coinMetaPromiseList.push(myCoinMetaData); | ||
| }); | ||
|
|
||
| await Promise.allSettled(coinMetaPromiseList); | ||
|
|
||
| spendCoinList.forEach(spendCoin => { | ||
| Module.ccall( | ||
| 'js_setCSparkMintMetaId', // C++ function name | ||
| null, // Return type | ||
| ['number', 'number'], // Argument types | ||
| [spendCoin.metadata, spendCoin.setId], | ||
| ); | ||
| }); | ||
|
|
||
| try { | ||
| spendCoinList.forEach(spendCoin => { | ||
| Module.ccall( | ||
| 'js_addCoinToListForCreateSparkSpendTransaction', | ||
| null, | ||
| ['number', 'number'], | ||
| [coinsList, spendCoin.metadata], | ||
| ); | ||
| }); | ||
| } catch (error) { | ||
| console.error('Error adding coins to list:', error); | ||
| } | ||
| coverSetDataMap = Module.ccall( | ||
| 'js_createCoverSetDataMapForCreateSparkSpendTransaction', | ||
| 'number', | ||
| [], | ||
| [], | ||
| ); | ||
|
|
||
| const groupedBySet = spendCoinList.reduce( | ||
| (acc, coin) => { | ||
| if (!acc[coin.setId]) { | ||
| acc[coin.setId] = []; | ||
| } | ||
| acc[coin.setId].push(coin); | ||
| return acc; | ||
| }, | ||
| {} as Record<number, typeof spendCoinList>, | ||
| ); | ||
|
|
||
| console.debug('groupedBySet', groupedBySet); | ||
|
|
||
| const deserializedCoinList: Record<string, number[]> = {}; | ||
| // TODO: move to separate function | ||
|
|
||
| for (const set in groupedBySet) { | ||
| const fullSet = await db.getSetById(Number(set)); | ||
| fullSet?.coins.forEach(coin => { | ||
| const serializedCoin = getSerializedCoin( | ||
| coin[0] as string, | ||
| ) as unknown as ArrayLike<number>; | ||
|
|
||
| const serializedCoinPointer = Module._malloc(serializedCoin.length); | ||
| Module.HEAPU8.set(serializedCoin, serializedCoinPointer); | ||
|
|
||
| const serialContext = getSerializedCoin( | ||
| coin[2] as string, | ||
| ) as unknown as ArrayLike<number>; | ||
|
|
||
| const serialContextPointer = Module._malloc(serialContext.length); | ||
| Module.HEAPU8.set(serialContext, serialContextPointer); | ||
|
|
||
| deserializedCoinObj = Module.ccall( | ||
| 'js_deserializeCoin', | ||
| 'number', | ||
| ['number', 'number', 'number', 'number'], | ||
| [ | ||
| serializedCoinPointer, | ||
| serializedCoin.length, | ||
| serialContextPointer, | ||
| serialContext.length, | ||
| ], | ||
| ); | ||
|
|
||
| if (!deserializedCoinList[set]) { | ||
| deserializedCoinList[set] = []; | ||
| } | ||
| deserializedCoinList[set].push(deserializedCoinObj); | ||
| }); | ||
| } | ||
|
|
||
| const setHashList = await db.getSetHashes(); | ||
|
|
||
| for (const setId in groupedBySet) { | ||
| const coverSetRepresentation = Buffer.from( | ||
| setHashList[+setId - 1], | ||
| 'base64', | ||
| ); | ||
| console.debug('coverSetRepresentation :=>', coverSetRepresentation); | ||
| const coverSetRepresentationPointer = Module._malloc( | ||
| coverSetRepresentation.length, | ||
| ); | ||
| Module.HEAPU8.set(coverSetRepresentation, coverSetRepresentationPointer); | ||
|
|
||
| coverSetData = Module.ccall( | ||
| 'js_createCoverSetData', | ||
| 'number', | ||
| ['number', 'number'], | ||
| [coverSetRepresentationPointer, coverSetRepresentation.length], | ||
| ); | ||
|
|
||
| console.debug(groupedBySet, setId); | ||
| deserializedCoinList[setId].forEach(deserializedCoin => { | ||
| Module.ccall( | ||
| 'js_addCoinToCoverSetData', | ||
| null, | ||
| ['number', 'number'], | ||
| [coverSetData, deserializedCoin], | ||
| ); | ||
| // console.count('Deserialized coin added'); | ||
| }); | ||
|
|
||
| // Add cover set data to map (with group ID 1) | ||
| Module.ccall( | ||
| 'js_addCoverSetDataForCreateSparkSpendTransaction', | ||
| null, | ||
| ['number', 'number', 'number'], | ||
| [coverSetDataMap, BigInt(setId), coverSetData], | ||
| ); | ||
| } | ||
|
|
||
| idAndBlockHashesMap = Module.ccall( | ||
| 'js_createIdAndBlockHashesMapForCreateSparkSpendTransaction', | ||
| 'number', | ||
| [], | ||
| [], | ||
| ); | ||
|
|
||
| const blockHashList = await db.getBlockHashes(); | ||
| console.log('blockHashList =>>>', blockHashList); | ||
| for (const setId in groupedBySet) { | ||
| console.log(BigInt(setId), base64ToHex(blockHashList[+setId - 1])); | ||
| Module.ccall( | ||
| 'js_addIdAndBlockHashForCreateSparkSpendTransaction', | ||
| null, | ||
| ['number', 'number', 'string'], | ||
| [ | ||
| idAndBlockHashesMap, | ||
| BigInt(setId), | ||
| base64ToReversedHex(blockHashList[+setId - 1]), | ||
| ], | ||
| ); | ||
| } | ||
|
|
||
| const tempTx = new bitcoin.Psbt({ network: network.networkInfo }); | ||
| tempTx.setVersion(3 | (SPARK_TX_TYPE << 16)); // version 3 and tx type in high bits (3 | (SPARK_TX_TYPE << 16)); | ||
| tempTx.setLocktime(LOCK_TIME); // new Date().getTime() / 1000 | ||
|
|
||
| tempTx.addInput({ | ||
| hash: '0000000000000000000000000000000000000000000000000000000000000000', | ||
| index: 4294967295, | ||
| sequence: 4294967295, | ||
| finalScriptSig: Buffer.from('d3', 'hex'), | ||
| }); | ||
|
|
||
| const tempTxBuffer = tempTx.extractTransaction(true).toBuffer(); | ||
| const extendedTempTxBuffer = Buffer.concat([ | ||
| tempTxBuffer, | ||
| Buffer.from([0x00]), | ||
| ]); | ||
|
|
||
| const txHash = bitcoin.crypto.hash256(extendedTempTxBuffer); | ||
| const txHashSig = txHash.reverse().toString('hex'); | ||
|
|
||
| // TODO: check not spark case | ||
| if (!isSparkTransaction) { | ||
| tempTx.addOutput({ | ||
| address: to, | ||
| value: new BigNumber(amount).multipliedBy(10 ** 8).toNumber(), | ||
| }); | ||
| } | ||
|
|
||
| //tempTx// tx.signInput(0, spendKeyObj); // ? how to sign? Is I need to sign wit all utxo keypairs? // ? how to add subtractFeeFromAmount? // ? what is spend transaction type? // https://github.com/firoorg/sparkmobile/blob/main/include/spark.h#L22 | ||
| // tx.finalizeAllInputs(); | ||
| // const txHash = tx.extractTransaction() | ||
|
|
||
| // const txHashSig = txHash.getHash() | ||
|
|
||
| console.log('coinlist', coinsList); | ||
| const additionalTxSize = 0; | ||
|
|
||
| // Create the spend transaction | ||
| result = Module.ccall( | ||
| 'js_createSparkSpendTransaction', | ||
| 'number', | ||
| [ | ||
| 'number', | ||
| 'number', | ||
| 'number', | ||
| 'number', | ||
| 'number', | ||
| 'number', | ||
| 'number', | ||
| 'number', | ||
| 'string', | ||
| 'number', | ||
| ], | ||
| [ | ||
| spendKeyObj, | ||
| fullViewKeyObj, | ||
| incomingViewKeyObj, | ||
| recipientsVector, | ||
| privateRecipientsVector, | ||
| coinsList, | ||
| coverSetDataMap, | ||
| idAndBlockHashesMap, | ||
| txHashSig, | ||
| additionalTxSize, | ||
| ], | ||
| ); | ||
|
|
||
| console.log('final result is', result); | ||
|
|
||
| const serializedSpend = Module.ccall( | ||
| 'js_getCreateSparkSpendTxResultSerializedSpend', | ||
| 'number', // returns a pointer to the beginning of a byte array | ||
| ['number'], | ||
| [result], | ||
| ); | ||
|
|
||
| console.log('serializedSpend ==> ==>', serializedSpend); | ||
|
|
||
| const serializedSpendSize = Module.ccall( | ||
| 'js_getCreateSparkSpendTxResultSerializedSpendSize', | ||
| 'number', | ||
| ['number'], | ||
| [result], | ||
| ); | ||
|
|
||
| console.log(`Serialized spend size: `, serializedSpendSize); | ||
|
|
||
| const serializedSpendBytes = new Uint8Array( | ||
| Module.HEAPU8.buffer, | ||
| serializedSpend, | ||
| serializedSpendSize, | ||
| ); | ||
|
|
||
| // Make a copy (optional, because the above is just a view into WASM memory) | ||
| const spendDataCopy = new Uint8Array(serializedSpendBytes); | ||
|
|
||
| // If you need hex | ||
| const hex = Array.from(spendDataCopy) | ||
| .map(b => b.toString(16).padStart(2, '0')) | ||
| .join(''); | ||
|
|
```ts
  console.log('Serialized Spend (hex):', hex);

  const outputScriptsSize = Module.ccall(
    'js_getCreateSparkSpendTxResultOutputScriptsSize',
    'number',
    ['number'],
    [result],
  );
  const scripts = [];

  // Get each output script
  for (let i = 0; i < outputScriptsSize; i++) {
    const scriptPtr = Module.ccall(
      'js_getCreateSparkSpendTxResultOutputScriptAt',
      'number', // returns a pointer to the beginning of a byte array
      ['number', 'number'],
      [result, i],
    );

    console.log('Output script pointer:', scriptPtr);
    const scriptSize = Module.ccall(
      'js_getCreateSparkSpendTxResultOutputScriptSizeAt',
      'number',
      ['number', 'number'],
      [result, i],
    );
    console.log(`Output script ${i} size: ${scriptSize}`);

    const script = new Uint8Array(Module.HEAPU8.buffer, scriptPtr, scriptSize);

    scripts.push(script);
  }

  // Get spent coins information
  const spentCoinsSize = Module.ccall(
    'js_getCreateSparkSpendTxResultSpentCoinsSize',
    'number',
    ['number'],
    [result],
  );

  console.log(`Spent coins size: ${spentCoinsSize}`);

  for (let i = 0; i < spentCoinsSize; i++) {
    const spentCoinMeta = Module.ccall(
      'js_getCreateSparkSpendTxResultSpentCoinAt',
      'number',
      ['number', 'number'],
      [result, i],
    );
    const spentCoinValue = Module.ccall(
      'js_getCSparkMintMetaValue',
      'number',
      ['number'],
      [spentCoinMeta],
    );

    // const coinPtr = Module.ccall(
    //   '_js_getCoinFromMeta',
    //   'number',
    //   ['number', 'number'],
    //   [spentCoinMeta, incomingViewKeyObj],
    // );
    //
    // const hash = Module.ccall(
    //   '_js_getCoinHash',
    //   'number',
    //   ['number'],
    //   [coinPtr],
    // );
    //
    // console.log(`coin hash: ${hash}`);

    const spentCoinMetaDeserialized = new Uint8Array(
      Module.HEAPU8.buffer,
      spentCoinMeta,
      spentCoinMeta.length,
    );

    console.log(
      `spent coin meta and value ${spentCoinValue}, ${spentCoinMeta} ${spentCoinMetaDeserialized.toString()}`,
    );
  }

  // Get transaction fee
  // const fee = Module.ccall(
  //   'js_getCreateSparkSpendTxResultFee',
  //   'number',
  //   ['number'],
  //   [result],
  // );

  const psbt = new bitcoin.Psbt({ network: network.networkInfo });

  // const api = (await network.api()) as unknown as FiroAPI;

  psbt.addInput({
    hash: '0000000000000000000000000000000000000000000000000000000000000000',
    index: 4294967295,
    sequence: 4294967295,
    finalScriptSig: Buffer.from('d3', 'hex'),
  });

  psbt.setLocktime(LOCK_TIME);

  psbt.setVersion(3 | (SPARK_TX_TYPE << 16));
  scripts.forEach(script => {
    console.log('script is ==>', script);
    psbt.addOutput({
      script: Buffer.from(script),
      value: 0,
    });
  });

  const rawTx = psbt.extractTransaction();
  const txHex = rawTx.toHex();
  const sizeHex = numberToReversedHex(serializedSpendSize);
  const finalTx = txHex + 'fd' + sizeHex + hex;

  console.log('Final TX to broadcast:', finalTx);

  if (spendKeyObj && spendKeyObj !== 0) {
    Module.ccall('js_freeSpendKey', null, ['number'], [spendKeyObj]);
  }
  if (fullViewKeyObj && fullViewKeyObj !== 0) {
    Module.ccall('js_freeFullViewKey', null, ['number'], [fullViewKeyObj]);
  }
  if (incomingViewKeyObj && incomingViewKeyObj !== 0) {
    Module.ccall(
      'js_freeIncomingViewKey',
      null,
      ['number'],
      [incomingViewKeyObj],
    );
  }
  if (addressObj && addressObj !== 0) {
    Module.ccall('js_freeAddress', null, ['number'], [addressObj]);
  }
  if (recipientsVector) {
    Module.ccall(
      'js_freeSparkSpendRecipientsVector',
      null,
      ['number'],
      [recipientsVector],
    );
  }
  if (coinsList) {
    Module.ccall('js_freeSparkSpendCoinsList', null, ['number'], [coinsList]);
  }
  if (coverSetDataMap && coverSetDataMap !== 0) {
    Module.ccall(
      'js_freeCoverSetDataMapForCreateSparkSpendTransaction',
      null,
      ['number'],
      [coverSetDataMap],
    );
  }
  if (privateRecipientsVector) {
    Module.ccall(
      'js_freeSparkSpendPrivateRecipientsVector',
      null,
      ['number'],
      [privateRecipientsVector],
    );
  }
  if (idAndBlockHashesMap) {
    Module.ccall(
      'js_freeIdAndBlockHashesMap',
      null,
      ['number'],
      [idAndBlockHashesMap],
    );
  }
  if (deserializedCoinObj) {
    Module._free(deserializedCoinObj);
  }
  if (result && result !== 0) {
    Module.ccall('js_freeCreateSparkSpendTxResult', null, ['number'], [result]);
  }
  if (coverSetData && coverSetData !== 0) {
    Module.ccall('js_freeCoverSetData', null, ['number'], [coverSetData]);
  }

  return finalTx;
}
```
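A note on the final concatenation above: `0xfd` is Bitcoin's variable-length-integer marker for a 16-bit length, so `'fd' + sizeHex + hex` frames the serialized Spark payload with its byte count before appending it to the transaction hex. The sketch below illustrates that encoding; it assumes, as the name suggests, that `numberToReversedHex` emits little-endian (reversed) hex, and `encodeVarIntU16` is a hypothetical helper, not part of this PR.

```ts
// Hypothetical illustration of the varint framing used by finalTx above.
// Bitcoin varints: values < 0xfd are a single raw byte; 0xfd introduces a
// 16-bit little-endian length (the larger 0xfe/0xff forms are omitted here).
function encodeVarIntU16(n: number): string {
  if (n < 0xfd) {
    return n.toString(16).padStart(2, '0'); // single-byte form
  }
  const lo = (n & 0xff).toString(16).padStart(2, '0');
  const hi = ((n >> 8) & 0xff).toString(16).padStart(2, '0');
  return 'fd' + lo + hi; // marker byte + little-endian length
}

// Example: encodeVarIntU16(300) === 'fd2c01' (300 = 0x012c, bytes reversed)
```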
Critical: Still no try/finally block - cleanup doesn't run on errors or early returns.
Despite multiple past reviews flagging this critical issue, the function still lacks a try/finally block. All cleanup code (lines 522-582) runs only if execution reaches that point. If any error occurs earlier (e.g., WASM call fails, IndexedDB operation fails, Promise rejection), none of the cleanup executes, leaking all allocated WASM resources.
This is a blocking issue that must be fixed before merge.
🔎 Required fix: Wrap function body in try/finally

```diff
 export async function sendFromSparkAddress(
   network: bitcoin.Network,
   to: string,
   amount: string,
 ): Promise<string | undefined> {
   let spendKeyObj = 0;
   let fullViewKeyObj = 0;
   let incomingViewKeyObj = 0;
   let addressObj = 0;
   let recipientsVector = 0;
   let privateRecipientsVector = 0;
   let coinsList = 0;
   let coverSetDataMap = 0;
   let idAndBlockHashesMap = 0;
   let result = 0;
   let deserializedCoinObj = 0;
   let coverSetData = 0;
   const diversifier = 1n;
   const db = new IndexedDBHelper();
   const Module = await wasmInstance.getInstance();
+
+  try {
     spendKeyObj = await getSpendKeyObj(Module);
     const isSparkTransaction = await isSparkAddress(to);
     // ... all existing logic ...
     console.log('Final TX to broadcast:', finalTx);
+
+    return finalTx;
+  } finally {
+    // Move ALL cleanup here (lines 522-582)
+    try {
       if (spendKeyObj && spendKeyObj !== 0) {
         Module.ccall('js_freeSpendKey', null, ['number'], [spendKeyObj]);
       }
       // ... all other cleanup ...
+    } catch (e) {
+      console.error('Cleanup error:', e);
+    }
+  }
-
-  return finalTx;
 }
```
}Based on learnings from past reviews, proper WASM memory management requires function-level try/finally to prevent leaks in error cases.
Committable suggestion skipped: line range outside the PR's diff.
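Beyond wrapping this one function, a small tracking helper can make the pattern reusable across the Spark handlers. The sketch below is illustrative only and assumes an Emscripten-style `Module`; `withWasmResources` is a hypothetical name, not an API in this PR.

```ts
// Hypothetical helper: record a free-callback for each WASM allocation and
// run them all (in reverse allocation order) even when the wrapped logic throws.
type FreeFn = () => void;

async function withWasmResources<T>(
  run: (track: (free: FreeFn) => void) => Promise<T>,
): Promise<T> {
  const cleanups: FreeFn[] = [];
  try {
    return await run(free => cleanups.push(free));
  } finally {
    for (const free of cleanups.reverse()) {
      try {
        free(); // one failing free must not block the rest
      } catch (e) {
        console.error('Cleanup error:', e);
      }
    }
  }
}

// Usage sketch inside sendFromSparkAddress:
//   return withWasmResources(async track => {
//     const spendKeyObj = await getSpendKeyObj(Module);
//     track(() => Module.ccall('js_freeSpendKey', null, ['number'], [spendKeyObj]));
//     // ... build and return finalTx ...
//   });
```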
```ts
for (const setId in groupedBySet) {
  const coverSetRepresentation = Buffer.from(
    setHashList[+setId - 1],
    'base64',
  );
  console.debug('coverSetRepresentation :=>', coverSetRepresentation);
  const coverSetRepresentationPointer = Module._malloc(
    coverSetRepresentation.length,
  );
  Module.HEAPU8.set(coverSetRepresentation, coverSetRepresentationPointer);

  coverSetData = Module.ccall(
    'js_createCoverSetData',
    'number',
    ['number', 'number'],
    [coverSetRepresentationPointer, coverSetRepresentation.length],
  );

  console.debug(groupedBySet, setId);
  deserializedCoinList[setId].forEach(deserializedCoin => {
    Module.ccall(
      'js_addCoinToCoverSetData',
      null,
      ['number', 'number'],
      [coverSetData, deserializedCoin],
    );
    // console.count('Deserialized coin added');
  });

  // Add cover set data to map (with group ID 1)
  Module.ccall(
    'js_addCoverSetDataForCreateSparkSpendTransaction',
    null,
    ['number', 'number', 'number'],
    [coverSetDataMap, BigInt(setId), coverSetData],
  );
}
```
Critical: Memory leak in cover set loop - coverSetRepresentationPointer never freed.
Inside the loop (lines 238-274), coverSetRepresentationPointer is allocated with Module._malloc (lines 244-247) on every iteration but is never freed. This leaks memory proportional to the number of sets.
This was flagged in previous reviews but remains unresolved.
🔎 Proposed fix: Free malloc'd pointer after use

```diff
 for (const setId in groupedBySet) {
   const coverSetRepresentation = Buffer.from(
     setHashList[+setId - 1],
     'base64',
   );
   console.debug('coverSetRepresentation :=>', coverSetRepresentation);
   const coverSetRepresentationPointer = Module._malloc(
     coverSetRepresentation.length,
   );
   Module.HEAPU8.set(coverSetRepresentation, coverSetRepresentationPointer);
   coverSetData = Module.ccall(
     'js_createCoverSetData',
     'number',
     ['number', 'number'],
     [coverSetRepresentationPointer, coverSetRepresentation.length],
   );
+
+  // Free malloc'd buffer immediately after use
+  Module._free(coverSetRepresentationPointer);
   console.debug(groupedBySet, setId);
   deserializedCoinList[setId].forEach(deserializedCoin => {
     Module.ccall(
       'js_addCoinToCoverSetData',
       null,
       ['number', 'number'],
       [coverSetData, deserializedCoin],
     );
   });
   Module.ccall(
     'js_addCoverSetDataForCreateSparkSpendTransaction',
     null,
     ['number', 'number', 'number'],
     [coverSetDataMap, BigInt(setId), coverSetData],
   );
 }
```
🤖 Prompt for AI Agents

```
In packages/extension/src/libs/spark-handler/index.ts around lines 238 to 274,
the pointer coverSetRepresentationPointer allocated with Module._malloc on each
iteration is never freed, causing a memory leak. After the buffer has been
consumed by Module.ccall('js_createCoverSetData', ...), call
Module._free(coverSetRepresentationPointer) to release the allocation, and
ensure this free happens on all code paths (including error/early-return paths)
inside the loop so each iteration cleans up its malloc'd pointer.
```
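The "all code paths" requirement generalizes to a copy-in helper that owns the allocation for the duration of a callback. A minimal sketch, assuming only the standard Emscripten `_malloc`/`_free`/`HEAPU8` interface; `withHeapBytes` is a hypothetical name, not part of this PR:

```ts
// Hypothetical utility: copy bytes into the WASM heap, run a callback with
// the pointer, and always free the allocation afterwards.
function withHeapBytes<T>(
  Module: {
    _malloc(size: number): number;
    _free(ptr: number): void;
    HEAPU8: Uint8Array;
  },
  bytes: Uint8Array,
  fn: (ptr: number, len: number) => T,
): T {
  const ptr = Module._malloc(bytes.length);
  try {
    Module.HEAPU8.set(bytes, ptr);
    return fn(ptr, bytes.length);
  } finally {
    Module._free(ptr); // freed on both success and throw
  }
}

// Usage sketch for the cover-set loop:
//   coverSetData = withHeapBytes(Module, coverSetRepresentation, (ptr, len) =>
//     Module.ccall('js_createCoverSetData', 'number', ['number', 'number'], [ptr, len]),
//   );
```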
Summary by CodeRabbit
New Features
Improvements
Chores