Game Making & Editing FAQ/Q&A

Below are answers to Frequently Asked Questions - common questions and problems from User Generated Content creators and game developers that often go unanswered elsewhere. For a full list click here »

Is it Worth Mining Bitcoin

May 08, 2021, 04:36:38 PM by kat
Given the buzz surrounding bitcoin and cryptocurrencies generally, it's only natural to ask "is it really worth mining for bitcoin" and just what is "the truth about bitcoin mining". Well, TL;DR: it's not really worth doing except for the fun of it, the experience, or as an exercise in its own right [00], largely because mining is now such a colossally industrial process that all that remains for the individual are the scraps that go unnoticed, largely because they're not worth picking up.

So, is it really worth mining for bitcoin using a personal computer?

Short answer is "no".

Long answer is "not really".

Assuming the question is being asked from the average normie's point of view - a person just wanting to use their computer, laptop or mobile device to mine, rather than a hardcore miner using dedicated ASIC hardware designed specifically for the job - before mining for bitcoin the digital prospector has to consider the following;
  • The cost of electricity needed to run the equipment.
  • The (current & future) price of bitcoin (or whatever hash is being mined, the 'coin' herein).
While both concerns are important for figuring out whether it's worth mining for bitcoin, the first, the cost of electricity, has greater influence over a decision to mine than the second, the price of bitcoin, because power prices determine the running costs of mining equipment, which in turn define the break-even point for any resulting return on investment/value (RoI/RoV).

Simply put, if it costs more to run the hardware used to mine coins than current rewards from mining then it's simply not worth doing, except as an exercise in its own right, because the prospector is more-or-less always then running at a loss.

In other words then, as of writing the only way it would be cost effective, even 'profitable', to mine bitcoin is through use of 'free' electricity, not in the ‘someone else is paying for it’ sense, but rather power that is supplied locally, perhaps as self-owned solar, wind, water etc., anything otherwise generated 'off-grid' or in the proverbial backyard. If this is the case then there is no 'cost' against which mining needs to be leveraged (notwithstanding costs associated with maintenance and upkeep of all the hardware used in the process, from solar panels to graphics card et al) so bitcoin mining can be done full steam ahead.
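The break-even logic above can be sketched in a few lines of Python. All figures here (wattage, electricity price, daily reward) are hypothetical placeholders for illustration, not real mining numbers:

```python
# Rough daily break-even sketch for home mining.
# All figures are hypothetical - plug in your own hardware and tariff.
watts = 150            # power draw of the mining hardware (W)
price_per_kwh = 0.15   # grid electricity price ($/kWh)
hours = 24             # hours mined per day
daily_reward = 0.50    # $ value of coin mined per day (hypothetical)

daily_cost = (watts / 1000) * hours * price_per_kwh  # kWh used x price

print(f"Daily electricity cost: ${daily_cost:.2f}")
print(f"Daily mining reward:    ${daily_reward:.2f}")
print("Worth mining" if daily_reward > daily_cost else "Running at a loss")
```

With 'free' off-grid power, daily_cost effectively drops to zero and any reward is a gain (maintenance and upkeep aside).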

With that said, in terms of profitability/RoI/RoV, the aforementioned is compounded further by;
  • The rate at which mining occurs [1].
  • How much reward there is for doing said mining [2].
  • If there even is a reward for doing so [3].
What this all means in practice is this; coins are generally highly divisible by nature, some by as many as twelve decimal places, e.g. 0.000000000001 or to the trillionth decimal. So whilst their availability may be exponentially greater than physical (fiat) currencies like the US Dollar, the potential share per mining session is going to be relatively small, even accounting for the inbuilt scarcity that underwrites the value of cryptocurrency as a whole. In other words, whilst a successful mining session might return 35,000 'points', if the session's results are even accepted on the network, that might translate to earning 0.0000000350 (35 billionths) of a coin. If said coin has a market value of $100.00, the 35,000 rewarded 'points' would be worth fractions of a fraction of a cent (a few millionths of a dollar) – the coin itself would have to be worth high millions, even billions or trillions, for any mined points to be reasonably convertible to cash of any value given the costs and effort used to mine it.
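The arithmetic in that worked example can be checked directly, assuming, as the example does, that one mined 'point' corresponds to a trillionth of a coin:

```python
# The worked example above: 35,000 mined 'points' at a trillionth
# of a coin each, for a coin with a market value of $100.00.
points = 35_000
coin_fraction = points * 1e-12   # = 0.000000035, i.e. 35 billionths of a coin
coin_price = 100.00              # $ per whole coin

payout = coin_fraction * coin_price
print(f"{coin_fraction:.10f} of a coin -> ${payout:.9f}")
# -> 0.0000000350 of a coin -> $0.000003500 (fractions of a fraction of a cent)
```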

And this is where the dilemma occurs for ordinary people wanting to mine bitcoin with their computers or devices: it's simply no longer economically feasible if power is being fed in from the grid and has to be paid for; it will cost more to mine than can be earned from mining [4].

The bottom line on mining for bitcoin these days is this; it’s potentially more profitable to simply buy and trade bitcoin or hold on to it to capitalise on potential gains from exchange/commodity increases, especially as Governments the globe over are turning to digital currencies knowing their cash reserves are falling well short of their debt obligations.

[00] There is another reason for mining and that's to acquire coins in the present as a bet against a future price rise of the coin mined. In many ways mining cryptocurrencies with this in mind is akin to purchasing stocks or shares, owning gold, silver etc., or owning other price-based commodities where gains can be had if the price of the item held increases. For crypto this essentially means betting the current costs of mining against future price increases. Cryptocurrency mining with this in mind generally means being more selective with coin choice as not all crypto is equal, the more popular a coin is now the less profitable it is in the present.

[1] Mining bitcoin tends to cycle hardware to its limit – looking at Task Manager in Windows, for example, will typically show mining software running at between 70% and 95% CPU use depending on where the calculations are in the process and/or whether the computer is doing something else at the time (like opening and viewing Task Manager itself). It should be noted that mining bitcoin using a CPU tends to be more energy intensive than mining using a discrete graphics card (CPU vs GPU mining); the CPU also produces more heat, uses more power and typically has a lower session throughput.

[2] The reward for mining sessions depends on what’s being mined, it isn’t fixed in the sense that ‘X’ outcome occurs with ‘Y’ input.

[3] Depending on the mining software, pool network and other factors, mining is typically 'session' based, with sessions lasting from a few minutes to a number of hours. Each session has to be completed in full for the miner to have any chance of earning a return (hence mining often being compared to entering or participating in a lottery - there is a high degree of chance involved, especially when working as a solo miner). This is also conditional on the results of the session being accepted. If not, and sessions are rejected, no earnings occur, so all the costs incurred for that session, the electricity used etc., are essentially lost.

[4] Most bitcoin mining now is done in locations where electricity is cheap enough that none of the above matters, by institutions that supply their own power, or those that are heavily subsidised.

Carrier Services is having trouble with Google Play Services

September 01, 2019, 07:53:00 AM by kat

There's a persistent synchronisation error on Android devices (Android Pie [9] in particular) that prevents, interrupts or interferes with applications' or services' ability to automatically or self update; email apps (Outlook, Gmail, Blackberry Hub+ Inbox etc.), calendar apps, tasks and other scheduled or time-sensitive apps seem to be particularly affected. The error seems especially prevalent with POP (POP3) services associated with privately held domain names (non Gmail, Outlook, Yahoo etc. email).

Google Play Services
Carrier Services is having trouble with Google Play Services...

It seems that disabling access permissions for any item under Google Play services, specifically those relating to the Carrier Services component, interferes with dependent apps' ability to function properly. However, a further error then states;

This app won't work properly unless you allow Google Play services' request to access the following:
● [disabled items list
● Body sensors
● Camera
● Location
● etc.]
To continue, open Settings, then Permissions, and allow all listed items.
Unfortunately, even with all items enabled as requested (which may cause privacy concerns), the error still occurs intermittently, suggesting there may be issues with various apps and services connecting to and/or using Android 9 rather than there necessarily being an access or permissions issue.

The upshot: there doesn't appear to be a single fix for the email synchronisation issue, which may require Carrier services options and/or Account Sync to be enabled/disabled occasionally to kick the system into action.

Delete *.asf files

April 11, 2019, 12:10:10 PM by kat
ASF files are data containers, a little like MP3 files, that can hold video and audio information, but for some reason Windows, particularly Windows 7, has trouble deleting or moving them once created (typically after converting or exporting a video from another format to *.asf using VLC or similar software), which can cause Windows Explorer, or the PC being used, to hang or become unresponsive.

To delete *.asf files try the following;

1) open the Command prompt with administrative privileges (right click cmd.exe and select "Run as administrator").

2) browse to the containing folder, either 'cd' down into each folder in turn ('cd..' to go back up);
Code:
C:\>cd social
C:\social>cd IMVU
C:\social\IMVU>cd furniture

or 'cd' the full system path to containing folder;
Code:
C:\>cd social\IMVU\furniture

3) type; "del /f [filename.asf]" (where 'filename.asf' is the name of the asf file to be deleted - exclude '[' & ']') then hit Enter;
Code:
C:\social\IMVU\furniture>del /f filename.asf
If the system appears to hang, open Task Manager (right-click the Task Bar, select "Task Manager") and see if "dllhost.exe" is running (the exe is invoked into a high usage state when dealing with .asf files and the removal process). If it is (it should be using high resources), select it and click the "End Process" button bottom-right. This kills the process and frees the system to delete the file (it will be deleted outright, bypassing the Recycle Bin).

4) (optional) reboot to make sure the processes that should be running, are.

The above avoids the need to download and install other unknown applications and software that may themselves present issues.

The best computer for 3D rendering

May 06, 2018, 04:27:11 AM by kat

You've likely heard it before, lurk 3D communities long enough and someone is bound to ask “what’s the best desktop for 3D rendering”, "what's the best laptop..." and so on. Well, assuming the question isn’t being asked for reasons of idle curiosity, as the saying goes "if you have to ask, you can’t afford"[1].


Flippancy aside, if you're specifically asking "what's the best desktop for 3D rendering" and accompanying the question with a list of consumer-rated components and parts, you are fundamentally misunderstanding the problem because, frankly, the best desktop (or consumer-grade 'whatever') for 3D rendering is not going to be your typical desktop, workstation or thingamajig, but a machine built specifically for the task (often as rack units), rendering 3D content in similar fashion to ASIC 'computers' mining bitcoin and only bitcoin - render 'blades' are set up and managed just to produce rendered output and little else. Not especially cheap, user-friendly or with a great RoI for the consumer compared to the common-garden hardware the end-user might be more familiar with.

So, if you're serious about wanting to know what the best computer for rendering is because you're considering an upgrade, it might be better to ask "what's the best desktop for 3D rendering in the $600-$1500 price range", or "the best desktop under $1000 for 3D" or some such; that way you set a practical (pragmatic) budget limit on the question that helps others answer more appropriately, instead of what normally happens - they shoot for the moon and suggest buying a system stacked with multiple bleeding-edge graphics cards, CPUs and huge amounts of RAM (because that's what they have!)[2].

If you want a sensible answer, ask a sensible question.

Tips for buying 'the best' computer for 3D rendering
With all that said, depending on your circumstances, the older your current hardware the more options you have for buying better (not necessarily "best") hardware for 3D than you currently have, especially if you're on a budget. If, for example, you're still using a second (2xxx) or third (3xxx) generation Intel processor or equivalent AMD, upgrade costs can be kept down by looking at used or refurbished desktops, laptops or workstations one or two generations down the family tree, 4th (4xxx, e.g. i7-4790) and 6th (6xxx, e.g. i5-6400) generation in particular; as newer generation CPUs become available, older gear is often sold complete and ready to run much more cheaply than the cost of a low-end new machine. The same holds true of GPUs and graphics cards.

Obviously, upgrading in this way does mean doing your homework, and perhaps watching sites where end-users sell their old desktops and computer components in good condition for a reasonable price.

With this in mind, research CPU and GPU benchmark sites, because many commenters responding to the "best…" question often incorrectly suggest you should just get the latest generation hardware because "it beats old generation processors/graphics hands down". Not true, at least not always, and where it is, a marginal 10 or 15% increase over five generations hardly qualifies as "beats hands down", certainly not for the additional expense involved.
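As a quick illustration of that point, performance-per-dollar is the ratio worth computing. The benchmark scores and prices below are invented for the sake of the example, not real figures:

```python
# Hypothetical figures only: a modest benchmark gain at a much higher
# price rarely 'beats hands down' once cost is factored in.
systems = {
    "used 4th-gen desktop":   {"score": 8000, "price": 250},
    "new latest-gen desktop": {"score": 9200, "price": 900},
}

for name, s in systems.items():
    ratio = s["score"] / s["price"]  # benchmark points per dollar spent
    print(f"{name}: {ratio:.1f} points per dollar")
```

Run against real benchmark numbers for the systems you are comparing, this makes it obvious when a "hands down" winner is actually a poor value.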

Where possible stick with branded hardware, which is less likely to have issues with parts and compatibility (hardware age notwithstanding; the older something is the more difficult it might be to source spare and replacement parts).

Within the budget you've allowed yourself, try to max out on RAM; if the system being looked at has a maximum allowance of 16GB, try to source that (in addition to, or to replace, what may already be in the system when purchased). Similarly, opt for an SSD for the OS (256GB minimum for Windows 10) and at least 7,200 rpm hard-drives.

Being frugal, and patient, it's possible to build a workable solution for 3D generally, not just rendering, for a few hundred dollars that will outperform a new computer in the same ball-park.

[1] quote apparently attributed to J. P. Morgan.

[2] this is the digital artist, developer or computer user's equivalent of red-Ferrari compensatory activity.

MXM graphics card upgrade (heatsink)

February 22, 2018, 08:17:33 AM by kat
The process below relates to making a simple custom heatsink for an ATI/AMD 6700M MXM graphics card (an AMD 5950M recognised by Windows 10 as a 6700M) with limited tools, to fit inside an HP Elitedesk 800 G1 USDT (Ultra-Small/Slim Desk Top) computer (the system uses the same/similar MXM module configuration typical of mobile workstation or laptop video card upgrades - note also, MXM type II graphics cards only fit MXM mounting slots, they cannot be fitted to internal/laptop PCI Express slots). Notwithstanding the heat-sink itself, to actually run an MXM graphics card a 180W external power adapter was needed (Part No. #613766-001 or alternatively #613766-002) as the original 135W unit results in a POST error relating to insufficient power.

Parts used/needed;
- AMD/nVidia Type III MXM card[1].
- Copper or aluminium plate or sheet[2].
- 40 x 40 x 30mm aluminium heatsink[3].
- 3M double-sided thermal adhesive pad/tape[4].
- CPU/GPU thermal pad/s (silicone)[5].

- Machine screws/bolts; M2 (2mm x 8mm) and M1.6 (1.6mm x 5mm)[6].

Tools used for the job
- junior hacksaw with metal blade.
- standard bastard file for metal.
- needle files (round).
- 2mm drill bit for metal.
- pin-vice.
- wet-n-dry fine grit.
- craft knife.
- scissors.

Making the heat-sink
To keep the process as simple as possible, the plan is to mount the aluminium heatsink square on a section of plate that's cut to size and drilled so it can be mounted to the MXM module mounting posts on the motherboard (M1.6 screws). To avoid waste and keep the amount of work to a minimum, the baseplate will first be drawn to size on a sheet of paper or thin card. This will then be cut out and placed on the metal sheet, which will be marked and cut based on this template, mounting holes included.

The basic MXM heatsink with aluminium block and copper baseplate

First mark the mounting holes to determine base-plate actual size.
The simplest way to do this is use the MXM graphics card mounting bracket on the underside of the board (if the MXM board has no mounting bracket use the holes the bracket will attach to). Hold a piece of paper over the bracket (board underside) and poke holes where the mounts are. Double-check position and alignment (cf. #1 below).

Holes punched in paper to double-check measurements for baseplate

With holes punched, the distance between them should be;

 - 46mm centre-to-centre

Using at minimum a 2mm drill-bit to match the M2 mounting bolts/screws, this makes the inside edge-to-edge measurement 44mm (or 43mm if drilled out to 3mm), with an outside edge-to-edge of 48mm[7] (or 49mm at 3mm) (cf. below).

Basic measurements for the copper baseplate - 46mm centre-to-centre, 56x56mm

Knowing the MXM-specific mounting hole size (not the same as typically expected for ATI/AMD graphics cards) and placement, the heatsink plate can be drawn relative to the MXM graphics card's overall size and the GPU's position on the board[8].
Using 2mm thick copper or aluminium plate[9] (1.2mm minimum) and the centre-to-centre mounting hole distance of 46mm, add another 5mm from hole-centre to outside edge, making the plate 56mm x 56mm[10] overall. This forms the template and should look similar to the image below;
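The measurements above reduce to simple arithmetic, sketched here in Python (the 46mm hole spacing and 5mm margin are the figures from the text; the 2mm and 3mm hole sizes are those discussed in footnote [7]):

```python
# Baseplate geometry from the measurements in the text.
centre_to_centre = 46.0  # mm, mounting hole spacing
margin = 5.0             # mm, hole-centre to outside plate edge

plate = centre_to_centre + 2 * margin  # overall plate size
print(f"Baseplate: {plate:.0f}mm x {plate:.0f}mm")  # 56mm x 56mm

# Edge-to-edge distances for the 2mm and 3mm drill bits mentioned.
for bit in (2.0, 3.0):
    inside = centre_to_centre - bit   # inner hole edge to inner hole edge
    outside = centre_to_centre + bit  # outer hole edge to outer hole edge
    print(f"{bit:.0f}mm holes: inside {inside:.0f}mm, outside {outside:.0f}mm")
```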

Paper and card templates used to draw/scribe copper baseplate

Once drawn, the MXM heatsink template can be cut out and transferred.
Double-check measurements after the initial layout, then cut the paper or card template using a craft knife and straightedge or steel ruler. Place on the copper or aluminium plate, mark or scribe the baseplate's outline and centre-punch the mounting holes [11] (spray-glue may help here).

The basic heatsink baseplate marked on 2mm thick copper plate

Cut to shape and drill mounting holes.
Using a metal cutting saw, cut as close to the outside edge of the heatsink as possible (the outer border)[12] to minimise the amount of excess material that needs to be removed. Once cut, confirm the mounting hole centres are clearly punched, then drill using a 2mm or 3mm drill-bit for metal[13]. Finish up using a metal file to finalise the shape, removing any heavily scribed lines or marks on the upper surface with wet-n-dry sanding paper or other abrasive.

Copper (2mm) baseplate scribed, drilled and sized

Clean, de-oxidise and de-grease surfaces.
To make sure the heat-resistant double-sided tape sticks the aluminium heat-sink block and baseplate firmly together, clean and de-grease both surfaces using surface cleaners and/or alcohol[14] – this is critical for lasting adhesion. Apply the tape to the underside of the heatsink block – cut to shape/size and/or trim excess where needed. Centre the block over the plate and press down firmly[15]. The MXM heatsink unit is now ready to install.

The basic MXM cooling unit with aluminium heatsink block and copper baseplate

Installing the custom MXM heat-sink.
The final step is to install the heat-sink unit to the MXM graphics card module. Apply thermal grease to the GPU, or alternatively use a silicone thermal pad. Position the heat-sink and fasten using standard M2 bolts/screws – although pressure ensures a tight fit between GPU chip and heat-sink, be mindful of gaps that may form if fastening pressure is unevenly applied[16].

The custom made MXM heatsink installed in a HP 800 G1 USDT
Hardware properties of AMD Catalyst in Windows 10

Does the MXM heatsink work?
In a word, yes. The performance boost will differ depending on the module installed, but should generally be greater than embedded GPU chipsets provide. Unless installing an MXM card to run multiple monitors, the addition of an MXM graphics card does mean the system has two effective graphics units, or rather two GPUs: the MXM module and the embedded Intel-based chipset that came with the system. This may cause conflicts (power issues notwithstanding), which can generally be solved by disabling the embedded GPU in favour of the unit on the MXM card.

Why make a heatsink?
Wouldn't it be cheaper to just buy a heatsink? Ordinarily yes, if MXM graphics cards used standard fittings. As they don't, nothing off-the-shelf fits; either the mounting holes are too far apart or too close together, often by a millimetre or two, or stock heatsinks are too large to fit inside the confines of the USDT case/format - a similar issue as might be found in some server rack units where custom heatsinks need to be made to accommodate and cool server graphics cards.

[1] although "Type III" MXM graphics cards may physically fit the available MXM motherboard slot, they may not be hardware or Operating system compatible, a condition that might not be discovered until booting up.

[2] metal plate or sheet material for baseplate should be a minimum thickness of 1.2mm to limit distortion and flexing – thicker material can be used but will typically affect ease of production.

[3] heatsink dimensions are largely determined by the height from GPU to underside of the case lid, and reduced width/depth as allowed for access to mounting holes – larger prefabbed heatsinks can be used but will need altering to allow for mounting point access.

[4] thermal tape is often used to 'stick' heatsinks to chips, doing away with mounting pins and brackets. Success depends explicitly on clean surfaces. Thermal adhesives are not the same as silicone heatsink pads that aid heat transfer between surfaces.

[5] thermal pads made from silicone should be preferred to thermal paste, as their spongy resistance is used to 'tension' the heatsink once mounted, instead of the springs that might normally be used.

[6] to mount the heatsink itself to the MXM bracket the same type of M2 screws/bolts used in laptops can be used. To mount the MXM card itself to the motherboard MXM mounts M1.6 screws/bolts are needed. These requirements may vary depending on motherboard and card mounting brackets or posts.

[7] as the holes relate to M2 threaded bolts/screws they will need to be slightly larger to ensure the mounting bolts have wiggle room if needed to fit the mounting plate. Drilled with a 3mm bit, or a 2mm bit then expanded using a needle file, either/or subject to availability, this makes the inside edge-to-edge measurement between 43mm (3mm holes) and 44mm (2mm holes) – nominally 43.5mm – and the outside edge-to-edge between 48mm (2mm holes) and 49mm (3mm holes) – nominally 48.5mm.

[8] GPU chip placement is not always centred within the space defined by the bracket and holes, or perpendicular to the MXM board edges.

[9] baseplate should be a minimum of 1.2mm thick to minimise flexing when fastened to the MXM mounting bracket.

[10] the size described here is based on defining an area that allows enough room to fully support the mounting holes without undue bending or twisting of the plate (depending on plate thickness and tempering) – the heatsink baseplate could be made larger or smaller depending on the space available and/or whether partially or fully covering other onboard chips and modules is possible (so long as they don't obstruct the baseplate).

[11] it will be easier to mark or scribe around a card version of the template using an indelible pen, fine-line marker, or pointed object. If scribing, initial markings should be light so corrections can be made with relative ease.

[12] depending on the metal used for the baseplate, use a powered, 'junior' or full-sized hacksaw with a metal-cutting blade (teeth close together). To be absolutely sure of mounting hole placement, position the MXM card on top of the plate and mark down through the holes, double-checking their position relative to those already marked. Do this before cutting out the raw baseplate.

[13] drill one hole and double-check the diagonal (e.g. bottom-left to top-right) for placement and accuracy before drilling the opposite corner. To allow some wiggle room, use of a 3mm bit is recommended, else holes may be too tight (alternatively a needle file can be used to clean up or widen the holes). Countersink holes to de-burr.

[14] for copper plate in particular use Brasso or similar branded or off-brand, mild abrasive, surface cleaner/metal polish before clearing any residue with (isopropyl) alcohol or nail-polish remover.

[15] use a table and, once positioned, apply full weight to the unit for a moment to ensure absolutely fast adhesion. Test by checking for any play or wiggle – if the tape comes unstuck the surfaces were not properly cleaned and prepared.

[16] ideally the fasteners should be spring-loaded, in that a long bolt is fastened to the mounting bracket under the MXM board and tensioned by compression springs. Unfortunately, these types of fittings are not readily available for MXM cards, so the use of silicone thermal pads is recommended to provide adequate thermal transfer and compressive resistance to the downward pressure of the fixings used.