Create an array of all the dates between a start and end date with CarbonPeriod
...
$period = CarbonPeriod::create($startDate, $endDate);

// Iterate over each date in the period as a Carbon instance
foreach ($period as $date) {
    echo $date->format('Y-m-d'), PHP_EOL;
}

// Or collect every date in the period at once
$dates = $period->toArray();
To optimize the performance of a map call in JavaScript, you can do a few things:
Use forEach instead of map if you are not using the returned array. forEach is faster here because it does not allocate a new array for results you would throw away.
Run map over a smaller array when possible. For example, if you have a large array and only need to transform a subset of it, slice out the subset first and map over that.
Use a specialized method for the transformation where one exists. For example, to turn the whole array into a string, use toString (or join) rather than mapping each element yourself.
If you are using map to filter an array, use filter instead. filter is built for the job and avoids producing placeholder values you would have to clean up afterwards.
Use a plain for loop when you need to perform complex operations on each element. map shines for simple transformations, while a for loop gives you early exits, index control, and less function-call overhead.
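A quick sketch of the forEach and filter advice above (the sample array and results are made up for illustration):

```typescript
const numbers = [1, 2, 3, 4, 5, 6];

// Side effects only: forEach allocates no result array
let sum = 0;
numbers.forEach((n) => {
  sum += n;
});

// Filtering: filter does it in a single pass...
const evens = numbers.filter((n) => n % 2 === 0);

// ...whereas (mis)using map leaves placeholder values to clean up afterwards
const evensViaMap = numbers
  .map((n) => (n % 2 === 0 ? n : undefined))
  .filter((n): n is number => n !== undefined);
```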
The challenge I encountered was accessing Meteor.user() in React Router's route loader property.
Alternatively, instead of querying the server, I pick up the user id from localStorage. Its absence indicates the user is logged off or never logged in. Until Meteor kicks in, no sensitive data is displayed anyway.
...
{
path: '/',
element: <App />,
errorElement: <ErrorPage />,
loader: async () => window.localStorage.getItem('Meteor.userId'),
}
...
and in app.tsx:
const userId: UserId | null = useLoaderData()
const navigate = useNavigate()

useEffect(() => {
  if (userId === null) navigate('/login')
}, [userId])
Node's mongo driver appends the returned _id field to the original object reference.
myCollection.insert(myObject, () => {
  // myObject has been mutated to include the generated _id
  const { _id } = myObject;
  return _id;
});
unifi-os shell
curl -L https://github.com/unifi-utilities/unifios-utilities/raw/main/on-boot-script/packages/udm-boot_1.0.5_all.deb -o udm-boot.deb
dpkg -i udm-boot.deb
rm udm-boot.deb
exit
cd /mnt/data/on_boot.d
vi 15-add-root-ssh-key.sh
#!/bin/sh
#####################################################
# ADD RSA KEYS AS BELOW - CHANGE BEFORE RUNNING #
#####################################################
# set -- "ssh-rsa first key here all keys quoted" \ #
# "ssh-rsa each line appended with slash " \ #
# "ssh-rsa last one has no backslash" #
#####################################################
set -- "ssh-rsa ..." \
"ssh-rsa ...."
KEYS_FILE="/root/.ssh/authorized_keys"
counter=0
for key in "$@"
do
    # Place the public key in ~/.ssh/authorized_keys if not already present
    if ! grep -Fxq "$key" "$KEYS_FILE"; then
        counter=$((counter + 1))  # POSIX sh increment; 'let' is not portable
        echo "$key" >> "$KEYS_FILE"
    fi
done
echo "$counter keys added to $KEYS_FILE"
chmod +x 15-add-root-ssh-key.sh
./15-add-root-ssh-key.sh
cat /dev/null > /issue
cat /dev/null > /etc/issue
cat /dev/null > /etc/motd
vi /etc/motd
# Insert your own banner
UDM uses dropbear as its SSH server, so the configuration is applied at init.
Edit the dropbear configuration file:
vi /etc/default/dropbear
# See https://wiki.gentoo.org/wiki/Dropbear
DROPBEAR_OPTS="-sg"
/etc/init.d/dropbear restart
When using TypeScript and Firestore, we usually have to do a lot of manual casting when working with documents. One such example is getting the data of a document:
const thread = threadDocument.data(); // this will be of type any
Should we want to interact with the data in a type-safe manner, we'll have to cast it, which can quickly become tedious.
const thread = <ThreadData>threadDocument.data();
Additionally, when we write data to Firestore, there are no restrictions on how the data should look.
This is when Firestore Data Converters can come in handy. All we have to do is implement two methods - one where we constrain the data that gets written and one where we cast the data coming from Firestore:
const converter = {
  toFirestore: (dataToBeWritten: ThreadData) => dataToBeWritten,
  fromFirestore: (document: QueryDocumentSnapshot) => <ThreadData>document.data(),
};
To take this one step further, we can store the "converted" collection reference so we won't have to apply the converters each time we query the collection:
const threadCollection = db.collection("threads").withConverter(converter);
Now we can safely interact with the collection without having to cast the data:
const threadDocument = await threadCollection.doc(id).get();
const thread = threadDocument.data(); // this will be of type ThreadData
This is how we can obtain reactivity in our custom React.js hooks while working with the local storage, using the Pub/Sub (Observer) design pattern (with TypeScript support).
The goal is to implement a "useLocalStorage" custom hook, which will abstract away the complexity of reading from and writing to the local storage. As we know, each custom hook instantiates its own state. That is a problem in our case because when one instance of the hook updates the local storage, the state copies held by all the other hook instances will be out of sync and will never be updated.
We can solve this issue by mimicking a centralized shared state between our custom hook instances: we delegate the responsibility of keeping them in sync with the local storage to a custom "manager", the Observer object.
Our custom hook is built on this idea. First, the observer object:
export type Listener<EventType> = (event: EventType) => void;
export type ObserverReturnType<KeyType, EventType> = {
subscribe: (entryKey: KeyType, listener: Listener<EventType>) => () => void;
publish: (entryKey: KeyType, event: EventType) => void;
};
export default function createObserver<
KeyType extends string | number | symbol,
EventType,
>(): ObserverReturnType<KeyType, EventType> {
const listeners: Record<KeyType, Listener<EventType>[]> = {} as Record<
KeyType,
Listener<EventType>[]
>;
return {
subscribe: (entryKey: KeyType, listener: Listener<EventType>) => {
if (!listeners[entryKey]) listeners[entryKey] = [];
listeners[entryKey].push(listener);
return () => {
listeners[entryKey].splice(listeners[entryKey].indexOf(listener), 1);
};
},
publish: (entryKey: KeyType, event: EventType) => {
if (!listeners[entryKey]) listeners[entryKey] = [];
listeners[entryKey].forEach((listener: Listener<EventType>) =>
listener(event),
);
},
};
}
export const LocalStorageObserver = createObserver<
LOCAL_STORAGE_KEYS,
string
>();
export const { subscribe, publish } = LocalStorageObserver;
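In isolation, the observer behaves like this (a condensed copy of the factory above is inlined so the sketch runs on its own; the 'theme' key is made up for illustration):

```typescript
// Condensed re-statement of createObserver from above
type Listener<EventType> = (event: EventType) => void;

function createObserver<KeyType extends string, EventType>() {
  const listeners: Partial<Record<KeyType, Listener<EventType>[]>> = {};
  return {
    subscribe(entryKey: KeyType, listener: Listener<EventType>) {
      (listeners[entryKey] ??= []).push(listener);
      // Returns an unsubscribe function
      return () => {
        listeners[entryKey]?.splice(listeners[entryKey]!.indexOf(listener), 1);
      };
    },
    publish(entryKey: KeyType, event: EventType) {
      listeners[entryKey]?.forEach((listener) => listener(event));
    },
  };
}

const observer = createObserver<'theme', string>();
const received: string[] = [];

const unsubscribe = observer.subscribe('theme', (value) => received.push(value));
observer.publish('theme', 'dark');  // received: ['dark']
unsubscribe();
observer.publish('theme', 'light'); // no longer received
```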
The useLocalStorage custom hook (window checks are optional, depending on which environment this JavaScript will run on):
export function useLocalStorage<T>(key: LOCAL_STORAGE_KEYS, initialValue: T) {
const [storedValue, setStoredValue] = useState(() => {
if (typeof window === 'undefined') {
return initialValue;
}
try {
const item = window.localStorage.getItem(key);
return item ? JSON.parse(item) : initialValue;
} catch (error) {
return initialValue;
}
});
useEffect(() => {
  // Subscribe once per key and clean up on unmount to avoid leaking listeners
  const unsubscribe = LocalStorageObserver.subscribe(key, setStoredValue);
  return unsubscribe;
}, [key]);
const setValue = (value: T) => {
try {
const valueToStore =
value instanceof Function ? value(storedValue) : value;
setStoredValue(valueToStore);
LocalStorageObserver.publish(key, valueToStore);
if (typeof window !== 'undefined') {
window.localStorage.setItem(key, JSON.stringify(valueToStore));
}
} catch (error) {
console.error(error);
}
};
return [storedValue, setValue];
}
If you need to manually test a package you are developing and don't want to make a release each time you modify something, you can install it directly from your local machine.
You can achieve this by using the following commands, depending on the package manager you use:
yarn add [path-to-your-package]
npm install [path-to-your-package]
This results in the following equivalent entry in package.json:
...
"dependencies": {
"[package]": "file:[relative-path-to-package]",
...
},
The anatomy of a basic UI Vision task looks like this:
{
"Command": "XType",
"Target": "Hello World${KEY_ENTER}",
"Value": "",
"Description": ""
},
It can be a very simple task, such as typing some text into the previously selected input control, but it can also perform advanced sequences involving conditional structures and system calls. It can even execute JavaScript in a sandboxed context.
The two most important parameters are Value and Target, but their usage is not always intuitive.
Luckily, the documentation is clear and accessible as a quick link next to the command dropdown.
Let's call a Node.js process at some point in our automation flow.
To do this, make sure you have the UI Vision extra XModules installed. If you're running this on macOS, make sure to also grant the necessary permissions.
The inconsistent usage of Value and Target made me spend some time figuring out how to split the command. I've found that this works:
{
"Command": "XRun",
"Target": "/My Path/.n/bin/node",
"Value": "/My Path/automations/my_automation.js",
"Description": ""
},
The Target is the binary we want to run, and the Value is the parameter we want to pass to it: in our case, a JavaScript file.
Another caveat worth mentioning: in the Chrome extension, you can't use the Terminal as the Target process due to security concerns.
Using DocumentFragment does the trick because it isn't part of the active document tree structure.
This means it doesn't interact with the main document until mounted.
Hence, it doesn't impact the performance when changes occur to its contents.
// Setup
const outlet = document.getElementById('myList'); // getElementById takes an id, not a '#' selector
const fragment = new DocumentFragment();
const results = fetchResults(); // returns Array<Result>
We then go through each result and prepare it properly:
results.forEach((result) => {
  const li = document.createElement('li');
  li.textContent = result;
  li.classList.add('result');
  // We append to the DocumentFragment, not the live DOM
  fragment.appendChild(li);
});
Finally, we append the end result to the list:
outlet.appendChild(fragment);
If we were to append each item directly to the list outlet inside the loop above, we would trigger a repaint/reflow every time we performed the action.