Add conkys and wallpapers
.conky/arch/.conkyrc-arch
Executable file
@@ -0,0 +1,52 @@
conky.config = {
use_spacer='none',
use_xft=true,
font='DejaVu Sans:size=9',
text_buffer_size=2048,
update_interval=600.0,
total_run_times=0,

own_window=true,
own_window_transparent=true,
own_window_type='normal',
own_window_hints='undecorated,skip_taskbar,skip_pager',
own_window_class='Conky-arch',
own_window_argb_visual=true,
own_window_argb_value=0,

draw_shades=false,
draw_outline=false,
draw_borders=false,
stippled_borders=0,
double_buffer=true,

default_color='white',
default_shade_color='black',
-- Minimum size of text area
maximum_width=1200,
minimum_width=1200,

--alignment='top_right',
--gap_x=20,
--gap_y=20,

no_buffers=true,
net_avg_samples=2,

override_utf8_locale=true,

short_units=true,

color1 = '0099ff', -- Arch news
color2 = '0000ff', -- Titles
color3 = '3a3a3a', -- Dates
color4 = 'dddddd', -- Code tag
default_outline_color='black', --'00ccee'
lua_load = '~/.conky/arch/lua.lua'
};

conky.text = [[
${color1}${font DejaVu Sans:normal:size=24}${alignc}Arch-news
${lua_parse conky_print}
]];
.conky/arch/cache
Executable file
@@ -0,0 +1,84 @@
${alignr}${font DejaVu Sans:size=8}Retrieved: 09/14/2018
${color2}${font DejaVu Sans:size=12}libutf8proc>=2.1.1-3 update requires manual intervention
${color}${font}The libutf8proc package prior to version 2.1.1-3 had an incorrect soname link. This has been fixed in 2.1.1-3, so the upgrade will need to overwrite the untracked soname link created by ldconfig. If you get an error

${color4}${font DejaVu Sans:italic:size=9}libutf8proc: /usr/lib/libutf8proc.so.2 exists in filesystem${color}${font}

when updating, use

${color4}${font DejaVu Sans:italic:size=9}pacman -Suy --overwrite usr/lib/libutf8proc.so.2${color}${font}

to perform the upgrade.${color3}${font DejaVu Sans:size=8}${alignr} Updated: 07/14/2018

${color2}${font DejaVu Sans:size=12}js52 52.7.3-2 upgrade requires intervention
${color}${font}Due to the SONAME of ${color4}${font DejaVu Sans:italic:size=9}/usr/lib/libmozjs-52.so${color}${font} not matching its file name, ldconfig created an untracked file ${color4}${font DejaVu Sans:italic:size=9}/usr/lib/libmozjs-52.so.0${color}${font}. This is now fixed and both files are present in the package.

To pass the upgrade, remove ${color4}${font DejaVu Sans:italic:size=9}/usr/lib/libmozjs-52.so.0${color}${font} prior to upgrading.${color3}${font DejaVu Sans:size=8}${alignr} Updated: 05/04/2018

${color2}${font DejaVu Sans:size=12}glibc 2.27-2 and pam 1.3.0-2 may require manual intervention
${color}${font}The new version of glibc removes support for NIS and NIS+. The default ${color4}${font DejaVu Sans:italic:size=9}/etc/nsswitch.conf${color}${font} file provided by the ${color4}${font DejaVu Sans:italic:size=9}filesystem${color}${font} package already reflects this change. Please make sure to merge the pacnew file if it exists prior to upgrade.

NIS functionality can still be enabled by installing the ${color4}${font DejaVu Sans:italic:size=9}libnss_nis${color}${font} package. There is no replacement for NIS+ in the official repositories.

${color4}${font DejaVu Sans:italic:size=9}pam 1.3.0-2${color}${font} no longer ships the pam_unix2 module and ${color4}${font DejaVu Sans:italic:size=9}pam_unix_*.so${color}${font} compatibility symlinks. Before upgrading, review PAM configuration files in the ${color4}${font DejaVu Sans:italic:size=9}/etc/pam.d${color}${font} directory and replace removed modules with ${color4}${font DejaVu Sans:italic:size=9}pam_unix.so${color}${font}. Users of pam_unix2 should also reset their passwords after such a change. Defaults provided by the ${color4}${font DejaVu Sans:italic:size=9}pambase${color}${font} package do not need any modifications.${color3}${font DejaVu Sans:size=8}${alignr} Updated: 04/20/2018

${color2}${font DejaVu Sans:size=12}zita-resampler 1.6.0-1 -> 2 update requires manual intervention
${color}${font}The zita-resampler 1.6.0-1 package was missing a library symlink that has been readded in 1.6.0-2. If you installed 1.6.0-1, ldconfig would have created this symlink at install time, and it will conflict with the one included in 1.6.0-2. In that case, remove /usr/lib/libzita-resampler.so.1 manually before updating.${color3}${font DejaVu Sans:size=8}${alignr} Updated: 02/22/2018

${color2}${font DejaVu Sans:size=12}The end of i686 support
${color}${font}Following a nine-month deprecation period${color}${font}, support for the i686 architecture effectively ends today. By the end of November, i686 packages will be removed from our mirrors and later from the packages archive. The [multilib] repository is not affected.

For users unable to upgrade their hardware to x86_64, an alternative is a community-maintained fork named Arch Linux 32${color}${font}. See their website for details on migrating existing installations.${color3}${font DejaVu Sans:size=8}${alignr} Updated: 11/08/2017

${color2}${font DejaVu Sans:size=12}Perl library path change
${color}${font}The perl package now uses a versioned path for compiled modules. This means that modules built for a non-matching perl version will no longer be loaded and must be rebuilt.

A pacman hook warns about affected modules during the upgrade by showing output like this:

You must rebuild all affected packages against the new perl package before you can use them again. The change also affects modules installed directly via CPAN. Rebuilding will also be necessary again with future major perl updates like 5.28 and 5.30.

Please note that rebuilding was already required for major updates prior to this change; however, now perl will no longer try to load the modules and then fail in strange ways.

If the build system of some software does not detect the change automatically, you can use ${color4}${font DejaVu Sans:italic:size=9}perl -V:vendorarch${color}${font} in your PKGBUILD to query perl for the correct path. There is also ${color4}${font DejaVu Sans:italic:size=9}sitearch${color}${font} for software that is not packaged with pacman.${color3}${font DejaVu Sans:size=8}${alignr} Updated: 09/02/2017

${color2}${font DejaVu Sans:size=12}Deprecation of ABS tool and rsync endpoint
${color}${font}Due to the high maintenance cost of scripts related to the Arch Build System, we have decided to deprecate the ${color4}${font DejaVu Sans:italic:size=9}abs${color}${font} tool and thus rsync as a way of obtaining PKGBUILDs.

The ${color4}${font DejaVu Sans:italic:size=9}asp${color}${font} tool, available in [extra], provides similar functionality to ${color4}${font DejaVu Sans:italic:size=9}abs${color}${font}. ${color4}${font DejaVu Sans:italic:size=9}asp export pkgname${color}${font} can be used as a direct alternative; more information about its usage can be found in the documentation${color}${font}. Additionally, Subversion sparse checkouts, as described here${color}${font}, can be used to achieve a similar effect. For fetching all PKGBUILDs, the best way is cloning the svntogit${color}${font} mirrors.

While the ${color4}${font DejaVu Sans:italic:size=9}extra/abs${color}${font} package has already been dropped, the rsync endpoint (rsync://rsync.archlinux.org/abs) will be disabled by the end of the month.${color3}${font DejaVu Sans:size=8}${alignr} Updated: 05/15/2017

${color2}${font DejaVu Sans:size=12}ca-certificates-utils 20170307-1 upgrade requires manual intervention
${color}${font}The upgrade to ${font DejaVu Sans:bold:size=9}ca-certificates-utils 20170307-1${color}${font} requires manual intervention because a symlink which used to be generated post-install has been moved into the package proper.

As deleting the symlink may leave you unable to download packages, perform this upgrade in three steps:${color3}${font DejaVu Sans:size=8}${alignr} Updated: 03/15/2017

${color2}${font DejaVu Sans:size=12}mesa with libglvnd support is now in testing
${color}${font}${color4}${font DejaVu Sans:italic:size=9}mesa${color}${font}-17.0.0-3 can now be installed side by side with the ${color4}${font DejaVu Sans:italic:size=9}nvidia${color}${font}-378.13 driver without any libgl/libglx hacks, with the help of Fedora and upstream xorg-server patches.

The first step was to remove the libglx symlinks with xorg-server-1.19.1-3 and the associated mesa/nvidia drivers through the removal of various libgl packages. It was a tough moment because it broke optimus systems; ${color4}${font DejaVu Sans:italic:size=9}xorg-server${color}${font} configuration needed manual updating.

The second step is now here, with an updated 10-nvidia-drm-outputclass.conf${color}${font} file that should provide an "out-of-the-box" working ${color4}${font DejaVu Sans:italic:size=9}xorg-server${color}${font} experience on optimus systems.

Please test this extensively and post your feedback in this forum thread${color}${font} or in our bugtracker${color}${font}.${color3}${font DejaVu Sans:size=8}${alignr} Updated: 02/27/2017

${color2}${font DejaVu Sans:size=12}Phasing out i686 support
${color}${font}Due to the decreasing popularity of i686 among the developers and the community, we have decided to phase out support for this architecture.

The decision means that the February ISO will be the last that allows installing 32-bit Arch Linux. The next nine months are a deprecation period, during which i686 will still receive upgraded packages. Starting from November 2017, packaging and repository tools will no longer require that from maintainers, effectively making i686 unsupported.

However, as there is still some interest in keeping i686 alive, we would like to encourage the community to make it happen with our guidance. The arch-ports${color}${font} mailing list and the #archlinux-ports IRC channel on Freenode will be used for further coordination.

The [multilib] repository will not be affected by this change.${color3}${font DejaVu Sans:size=8}${alignr} Updated: 01/25/2017
.conky/arch/fetch_feed.sh
Executable file
@@ -0,0 +1,67 @@
#!/bin/bash
# Written by Peter Garceau
# Based on the RDF Feed Display Script by Hellf[i]re v0.1
#
# This script is designed for the Arch Linux News Feed.
#
# This script depends on curl.
# pacman -Sy curl
#
# Usage:
# .conkyrc: ${execi [time] /path/to/script/conky-rss.sh}
#
# Usage example:
# ${execi 300 /home/youruser/scripts/conky-rss.sh}

# Try to download the feed
RES=$(curl -fs https://www.archlinux.org/feeds/news/)
SUCCESS=$?
if [ $SUCCESS -eq 0 ]
then
    echo "$RES" > ~/.conky/arch/cache
fi

# RSS setup
LINES=4 # Number of headlines

FEED=$(cat ~/.conky/arch/cache)

if [ "$FEED" ]
then
    IFS=$'\n'
    TITLES=($(echo "$FEED" |
        sed -e 's/></>\n</g' |
        sed -n '/<title>/p'))

    ITEMS=($(echo "$FEED" |
        sed -e 's/<item>/\n<item>/g'))

    unset IFS

    OUTPUT=''
    i=1
    while [ $i -le $LINES ]
    do
        OUTPUT+='\n'
        OUTPUT+=$(echo "${TITLES[$i]}" | sed -n '/<title>/p' | \
            sed -e 's/<title>//' | sed -e 's/<\/title>//')
        OUTPUT+='\n'
        OUTPUT+=$(echo "${ITEMS[$i]}" | sed -e 's/.*<pubDate>\(.*\)<\/pubDate>.*/\1/' | \
            awk '{print $1 " " $2 " " $3}')
        OUTPUT+='\n'
        OUTPUT+=$(echo "${ITEMS[$i]}" | sed -e 's/.*<description>\(.*\)<\/description>.*/\1/' | \
            sed -e 's/<\/p> <p>/\n/g' | sed -e 's/<\/p>//g' | \
            sed -e 's/<p>//g')

        i=$(($i + 1))
    done
    echo -e "$OUTPUT"
    # Decode HTML entities, then map markup to display escapes
    echo -e "$OUTPUT" | sed -e 's/&lt;/</g' | sed -e 's/&gt;/>/g' | \
        sed -e 's/<ul>/\\n/g' | sed -e 's/<\/ul>/\\n/g' | sed -e 's/<li>/\\n• /g' | sed -e 's/<\/li>//g' | \
        sed -e 's/<a href=[^>]*>/\\3/g' | sed -e 's/<\/a>/\\d/g' | \
        sed -e 's/<strong>/\\b/g' | sed -e 's/<\/strong>/\\r/g' | \
        sed -e 's/<pre>//g' | sed -e 's/<\/pre>//g' | \
        sed -e 's/<code>/\\i/g' | sed -e 's/<\/code>/\\r/g' \
        > ~/.conky/arch/feed
fi
# Do not update anything if curl fails
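The sed/awk pipeline above is essentially a hand-rolled RSS scrape. As an illustrative sketch only (not part of this repo), the same extraction of entry titles plus the first three pubDate fields can be done with a real XML parser, assuming a standard RSS 2.0 layout like the Arch feed's:

```python
import xml.etree.ElementTree as ET

def parse_news(rss_text, limit=4):
    """Return (title, short date) pairs for the first `limit` <item> entries."""
    root = ET.fromstring(rss_text)
    out = []
    for item in root.findall("./channel/item")[:limit]:
        title = item.findtext("title", "")
        # Keep only "Day, DD Mon", like the awk '{print $1 " " $2 " " $3}' stage
        date = " ".join(item.findtext("pubDate", "").split()[:3])
        out.append((title, date))
    return out

# Hypothetical one-item feed for demonstration
SAMPLE = ("<rss><channel><item><title>libutf8proc update</title>"
          "<pubDate>Fri, 14 Sep 2018 10:00:00 GMT</pubDate></item></channel></rss>")
```

Unlike the sed version, a proper parser is not confused by attribute quoting or entity escaping inside the feed.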
.conky/arch/lua.lua
Normal file
@@ -0,0 +1,174 @@
line_width = 200
lipsum = [[

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Curabitur eget dolor a dui interdum rhoncus. Aenean congue nunc quis sem ultrices, vel fringilla tellus dignissim. Vivamus vitae purus ligula. Quisque ante elit, ultrices id aliquam a, porttitor quis risus. Class aptent taciti sociosqu ad litora torquent per conubia nostra, per inceptos himenaeos. Nunc elementum nisl nec efficitur mattis. Donec viverra tempor enim nec dictum. Suspendisse quam neque, posuere eu magna sit amet, rhoncus finibus risus. Phasellus tristique ac lacus eu ultricies. Quisque varius purus at eros rutrum hendrerit. Donec efficitur justo eu scelerisque mollis. Fusce dictum aliquam convallis. Phasellus eget nunc lacus. Nunc urna dui, tempor pharetra consectetur non, elementum sed purus. Phasellus consectetur quis libero et semper.

Pellentesque condimentum sem quis diam commodo convallis. Morbi et ligula sagittis, venenatis tortor ut, molestie dui. Nam egestas, purus eu efficitur mollis, ex neque convallis felis, at egestas nisi dolor sed enim. Pellentesque a felis facilisis orci dapibus porttitor. Aenean et viverra nulla, interdum eleifend neque. Nunc sem justo, vulputate et arcu eget, dignissim lobortis dui. Vivamus laoreet feugiat elit.

Fusce euismod nibh vitae orci gravida pretium. Quisque feugiat lacinia tortor eget sagittis. Morbi in mauris sit amet dui vehicula egestas. Curabitur sit amet facilisis lectus. Donec id turpis eleifend, hendrerit turpis eu, tristique velit. Curabitur pulvinar facilisis tincidunt. In hac habitasse platea dictumst. Donec convallis erat id neque pretium, non gravida felis imperdiet. Vivamus hendrerit, nunc id rutrum tristique, arcu leo auctor sapien, id fringilla mauris erat sed justo. Etiam pharetra quis enim sit amet imperdiet. Vestibulum condimentum massa vel ullamcorper malesuada. Duis libero eros, facilisis in suscipit in, varius id ipsum.

Donec in orci ac leo dapibus maximus sed eget tortor. Sed blandit eros lectus, et venenatis erat aliquam sed. Vestibulum risus ex, laoreet at malesuada non, egestas ut lacus. Quisque sed malesuada leo. Ut id ligula accumsan, commodo massa eget, porta mauris. Mauris a lectus ac tellus ornare aliquet quis at arcu. Curabitur nec sem sodales, mattis velit non, tempor arcu. Morbi sit amet maximus lacus, nec venenatis est. Curabitur et congue dui. Nullam lacinia augue non quam hendrerit facilisis.

Curabitur molestie mauris eget tempor mattis. Donec velit arcu, iaculis quis leo et, sodales venenatis erat. Maecenas in malesuada erat, vitae iaculis odio. Praesent id ultrices sem. In hac habitasse platea dictumst. In aliquet, nisl laoreet bibendum sodales, justo lacus aliquam mi, non elementum urna odio et nunc. Nulla interdum, ante in vestibulum pellentesque, nunc enim iaculis urna, eget convallis lectus ex sed tortor. Morbi suscipit malesuada felis at aliquam.
]]
text = [[
<p>
<H2>The string library</H2>
<p>
Lua supplies a range of useful functions for processing and manipulating strings in its standard library. More details are supplied in the <a href="/wiki/StringLibraryTutorial" >StringLibraryTutorial</a>. Below are a few examples of usage of the string library.
<DL>
<dt><dd><pre class="code">
> = <span class="library">string.byte</span>(<span class="string">"ABCDE"</span>, 2) <span class="comment">-- return the ASCII value of the second character</span>
66
> = <span class="library">string.char</span>(65,66,67,68,69) <span class="comment">-- return a string constructed from ASCII values</span>
ABCDE
> = <span class="library">string.find</span>(<span class="string">"hello Lua user"</span>, <span class="string">"Lua"</span>) <span class="comment">-- find substring "Lua"</span>
7 9
> = <span class="library">string.find</span>(<span class="string">"hello Lua user"</span>, <span class="string">"l+"</span>) <span class="comment">-- find one or more occurrences of "l"</span>
3 4
> = <span class="library">string.format</span>(<span class="string">"%.7f"</span>, <span class="library">math.pi</span>) <span class="comment">-- format a number</span>
3.1415927
> = <span class="library">string.format</span>(<span class="string">"%8s"</span>, <span class="string">"Lua"</span>) <span class="comment">-- format a string</span>
Lua
</pre>

</DL>
<p>
<H2>Coercion</H2>
<p>
Lua performs automatic conversion of numbers to strings and vice versa where it is appropriate. This is called <em>coercion</em>.
<DL>
<dt><dd><pre class="code">
> = <span class="string">"This is Lua version "</span> .. 5.1 .. <span class="string">" we are using."</span>
This is Lua version 5.1 we are using.
> = <span class="string">"Pi = "</span> .. <span class="library">math.pi</span>
Pi = 3.1415926535898
> = <span class="string">"Pi = "</span> .. 3.1415927
Pi = 3.1415927
</pre>

</DL>
As shown above, during coercion, we do not have full control over the formatting of the conversion. To format the number as a string as we would like we can use the <code>string.format()</code> function. e.g.,
<DL>
<dt><dd><pre class="code">
> = <span class="library">string.format</span>(<span class="string">"%.3f"</span>, 5.1)
5.100
> = <span class="string">"Lua version "</span> .. <span class="library">string.format</span>(<span class="string">"%.1f"</span>, 5.1)
Lua version 5.1
</pre>

]]
text = text:gsub("\n"," ")

function indent_entry(contents,init,break_at)
    if contents:len() <= break_at then
        return contents -- Done
    end

    local begin_tag,end_tag = contents:find("<(.-)>",init)
    if not begin_tag or begin_tag > break_at then
        -- No tag exists, or the next tag starts after the preferred break.
        -- Find the last space before the break; if there is none, break the word at break_at
        for i=break_at,init,-1 do -- Iterate backwards over the string
            if contents:sub(i,i) == " " then
                break_at = i
                break
            end
        end
        contents = contents:sub(1,break_at) .. "\n" .. contents:sub(break_at+1)
        init = break_at+1 -- Not necessarily the same value as passed to the function
        break_at = break_at + line_width
    elseif end_tag < break_at then
        -- The next tag ends before the preferred break:
        -- increase break_at by the length of the tag and iterate (init = end_tag)
        init = end_tag
        break_at = break_at + 1 + end_tag - begin_tag
    else
        -- break_at falls in the middle of a tag:
        -- increase break_at by the length of the tag
        init = end_tag + 1
        break_at = break_at + 1 + end_tag - begin_tag
    end
    return indent_entry(contents,init,break_at)
end

function parse_paragraph(par)
    return indent_entry(par:gsub("\n", " "),1,line_width)
end

function substitute_tags(content)
    return content:gsub("<code>",("${color4}${font %s}"):format(font_i(9)))
        :gsub("<b>",("${font %s}"):format(font_b(9)))
        :gsub("</(.-)>","${color}${font}")
        :gsub("<a(.-)>","")
end

function parse_entry(contents)
    local ret = ""
    for paragraph in contents:gmatch("<p>(.-)</p>") do
        if ret ~= "" then
            ret = ret .. "\n\n" -- Add a line break between all paragraphs except the first
        end
        ret = ret .. parse_paragraph(paragraph)
    end
    return substitute_tags(ret)
end

function format_entry(entry)
    local entry_contents = parse_entry(entry.summary)
    local res = ('${color2}${font %s}%s\n'):format(font(12),entry.title)
    res = ('%s${color}${font}%s'):format(res,entry_contents)
    res = ('%s${color3}${font %s}${alignr} Updated: %s\n\n'):format(res,font(8),os.date("%x",entry.updated_parsed))
    return res
end

function conky_print()
    local f = assert(get_file_handle('rb'))
    local content = f:read("*all")
    f:close()
    return content
end

function fetch_feed()
    -- Fetch over HTTP
    local http_request = require "http.request"
    local url = "https://www.archlinux.org/feeds/news/"
    local headers, stream = assert(http_request.new_from_uri(url):go())
    local body = assert(stream:get_body_as_string())
    if headers:get ":status" ~= "200" then
        error(body)
    end
    -- The third response header is expected to be Date (GMT); convert it to local time
    a,s,c = headers:geti(3)
    p="%a+, (%d+) (%a+) (%d+) (%d+):(%d+):(%d+) GMT"
    day,month,year,hour,min,sec=s:match(p)
    MON={Jan=1,Feb=2,Mar=3,Apr=4,May=5,Jun=6,Jul=7,Aug=8,Sep=9,Oct=10,Nov=11,Dec=12}
    month=MON[month]
    offset=os.time()-os.time(os.date("!*t"))
    retrieved = os.date("%x",os.time({day=day,month=month,year=year,hour=hour,min=min,sec=sec})+offset)

    -- Parse
    local feedparser = require("feedparser")
    entries = feedparser.parse(body).entries
    local res = ('${alignr}${font %s}Retrieved: %s\n'):format(font(8),retrieved)
    for key,entry in pairs(entries) do
        res = res .. format_entry(entry)
    end

    -- Save to cache
    local file = get_file_handle("w")
    io.output(file)
    io.write(res)
    io.close(file)
    return res
end

function font(size)
    return ('DejaVu Sans:size=%d'):format(size)
end

function font_b(size)
    return ('DejaVu Sans:bold:size=%d'):format(size)
end

function font_i(size)
    return ('DejaVu Sans:italic:size=%d'):format(size)
end

function get_file_handle(opt)
    local filename = '/home/kuba/.conky/arch/cache'
    return io.open(filename, opt)
end

s = fetch_feed()
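The indent_entry function above hard-wraps text at line_width visible characters while treating `<...>` markup as zero-width. A rough Python sketch of the same idea follows; it is an illustration of the technique, not a line-for-line port of the Lua (in particular, the visible-character count resets approximately at each break):

```python
def wrap_ignoring_tags(text, width):
    """Insert line breaks so no line exceeds `width` visible characters,
    counting <...> tags as zero-width and breaking at the last space."""
    out = []
    visible = 0
    last_space = None  # index in `out` where a break could go
    i = 0
    while i < len(text):
        ch = text[i]
        if ch == "<":  # copy the whole tag through; it contributes no width
            j = text.find(">", i)
            if j == -1:
                j = len(text) - 1
            out.append(text[i:j + 1])
            i = j + 1
            continue
        out.append(ch)
        if ch == " ":
            last_space = len(out) - 1
        visible += 1
        if visible >= width:
            if last_space is not None:
                out[last_space] = "\n"  # break at the last seen space
            else:
                out.append("\n")        # no space available: break mid-word
            visible = 0  # approximate reset, like starting a fresh line
            last_space = None
        i += 1
    return "".join(out)
```

Because tags are skipped in one piece, a break can never land inside `<b>` or `</b>`, which is the property the Lua version's three branches exist to guarantee.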
.conky/gmail/.conkyrc-gmail
Executable file
@@ -0,0 +1,54 @@
conky.config = {
use_spacer='none',
use_xft=true,
font='DejaVu Sans:size=9',
text_buffer_size=2048,
update_interval=600,
total_run_times=0,

own_window=true,
own_window_transparent=true,
own_window_type='normal',
own_window_hints='undecorated,skip_taskbar,skip_pager',
own_window_class='Conky-gmail',
own_window_argb_visual=true,
own_window_argb_value=0,

draw_shades=false,
draw_outline=false,
draw_borders=false,
stippled_borders=0,
double_buffer=true,

default_color='white',
default_shade_color='black',
-- Minimum size of text area
maximum_width=1200,
minimum_width=1200,

--alignment='top_right',
--gap_x=20,
--gap_y=20,

no_buffers=true,
net_avg_samples=2,

override_utf8_locale=true,

short_units=true,

color1 = '0099ff',
color2 = '0000ff',
color3 = '3a3a3a',
color4 = 'dddddd',
default_outline_color='2edd2e'
};

conky.text = [[
${alignc}${font FontAwesome:size=24}${font Liberation Sans:size=24}mail${font}
${color7}${hr}${color}
# ${execp ~/.conky/gmail/gmail_imap.py}
]];
.conky/gmail/gmail_imap.py
Executable file
@@ -0,0 +1,116 @@
#!/usr/bin/env python3

import sys
import imaplib
import email
import email.header
import datetime
import json
import time
import os.path
from os.path import expanduser

WRAP_LIMIT = 80
path = "/home/kuba/.conky/gmail/last.json"


def process_mailbox(M):
    rv, data = M.search(None, "ALL")
    if rv != 'OK':
        print("No messages found!")
        return
    numbers = data[0].split()
    timestamp = str(round(time.time()))

    parsed_data = {}  # Find and save the last 10 emails
    parsed_data['timestamp'] = timestamp
    emails = []
    for num in reversed(numbers[-10:]):
        rv, data = M.fetch(num, '(RFC822)')
        if rv != 'OK':
            print("ERROR getting message", num)
            return
        eml = {}

        msg = email.message_from_bytes(data[0][1])
        eml.update({'subject': decode_field(msg['Subject'])})
        eml.update({'sender': decode_field(msg['From'])})
        date_tuple = email.utils.parsedate_tz(msg['Date'])
        if date_tuple:
            local_date = datetime.datetime.fromtimestamp(
                email.utils.mktime_tz(date_tuple))
            eml.update({'date': local_date.strftime("%a, %d %b %Y %H:%M:%S")})
        emails.append(eml)

    parsed_data['emails'] = emails
    with open(path, 'w') as outfile:
        json.dump(parsed_data, outfile)


def fetch_mail():
    M = imaplib.IMAP4_SSL('imap.gmail.com')

    try:
        json_data = open(expanduser('~') + '/.conky/scripts/.passwords.json')
        data = json.load(json_data)
        username = data['gmail']['username']
        password = data['gmail']['password']
        rv, data = M.login(username, password)
    except imaplib.IMAP4.error:
        print("LOGIN FAILED!!!")
        sys.exit(1)

    rv, data = M.select()
    if rv == 'OK':
        process_mailbox(M)
        M.close()
    else:
        print("ERROR: Unable to open mailbox ", rv)

    M.logout()


def fill(text, width):
    '''A custom method to assist in pretty printing'''
    if len(text) < width:
        return text + ' ' * (width - len(text))
    else:
        return text


def decode_field(data):
    decode = email.header.decode_header(data)[0]
    if decode[1] is None or decode[1] == 'unknown-8bit':
        return str(decode[0])
    else:
        return str(decode[0], decode[1])


def print_output(data):
    timestamp = int(data['timestamp'])
    updated = time.strftime("%m/%d %H:%M", time.gmtime(timestamp))
    print("${alignr}Updated: ${color white}%s" % updated)

    for mail in data['emails']:
        subject = mail['subject']
        sender = mail['sender']
        dte = mail['date']
        if len(subject) > WRAP_LIMIT - 3:
            print("${color1}%s..." % subject[:WRAP_LIMIT - 3])
        else:
            print("${color1}%s" % subject)


# Fetch first if the cache file does not exist yet
if not os.path.isfile(path):
    fetch_mail()
with open(path) as infile:
    data = json.load(infile)

# Refetch if the cached data is more than an hour old
ts = int(data['timestamp'])
now = round(time.time())
if ts + 3600 < now:
    fetch_mail()
    with open(path) as infile:
        data = json.load(infile)

# Format output
print_output(data)
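The Date-header handling in the script above goes through email.utils; the conversion can be checked on its own with a fixed header string (the date below is an arbitrary example, not from a real mailbox):

```python
import email.utils
import datetime

def header_date_to_local(date_header):
    """Convert an RFC 2822 Date header to a naive local datetime,
    via the same email.utils path gmail_imap.py uses."""
    date_tuple = email.utils.parsedate_tz(date_header)
    if not date_tuple:
        return None  # parsedate_tz returns None for unparseable input
    return datetime.datetime.fromtimestamp(email.utils.mktime_tz(date_tuple))

# mktime_tz honours the +0000 offset, so the epoch value is unambiguous
epoch = email.utils.mktime_tz(
    email.utils.parsedate_tz("Fri, 14 Sep 2018 10:00:00 +0000"))
```

Because mktime_tz folds the timezone offset into the epoch value, the later fromtimestamp call is the only step that depends on the machine's local timezone.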
0
.conky/gmail/last.json
Normal file
.conky/gmail/lua.lua
Normal file
@@ -0,0 +1,147 @@
line_width = 200

function indent_entry(contents, init, break_at)
    if contents:len() <= break_at then
        return contents -- Done
    end

    local begin_tag, end_tag = contents:find("<(.-)>", init)
    if not begin_tag or begin_tag > break_at then
        -- No tag exists, or the next tag starts after the preferred break.
        -- Break at the last space before the limit; if there is no space, break the word at break_at.
        for i = break_at, init, -1 do -- Iterate backwards over the string
            if contents:sub(i, i) == " " then
                break_at = i
                break
            end
        end
        contents = contents:sub(1, break_at) .. "\n" .. contents:sub(break_at + 1)
        init = break_at + 1 -- Not necessarily the same value as passed to the function
        break_at = break_at + line_width
        debug = "No tag or tag behind break."
    elseif end_tag < break_at then
        -- The next tag ends before the preferred break:
        -- increase break_at by the length of the tag and iterate (init = end_tag).
        init = end_tag
        break_at = break_at + 1 + end_tag - begin_tag
        debug = "Tag entirely before break"
    else
        -- break_at falls in the middle of a tag:
        -- increase break_at by the length of the tag.
        init = end_tag + 1
        break_at = break_at + 1 + end_tag - begin_tag
        debug = "Tag in the middle of break"
    end
    return indent_entry(contents, init, break_at)
end

function parse_paragraph(par)
    return indent_entry(par:gsub("\n", " "), 1, line_width)
end

function substitute_tags(content)
    return content:gsub("<code>", ("${color4}${font %s}"):format(font_i(9)))
        :gsub("<b>", ("${font %s}"):format(font_b(9)))
        :gsub("</(.-)>", "${color}${font}")
        :gsub("<a(.-)>", "")
end

function parse_entry(contents)
    local ret = ""
    for paragraph in contents:gmatch("<p>(.-)</p>") do
        if ret ~= "" then
            ret = ret .. "\n\n" -- Add a line break before all except the first paragraph
        end
        ret = ret .. parse_paragraph(paragraph)
    end
    return substitute_tags(ret)
end

function format_entry(entry)
    local entry_contents = parse_entry(entry.summary)
    local res = ('${color2}${font %s}%s\n'):format(font(12), entry.title)
    res = ('%s${color}${font}%s'):format(res, entry_contents)
    res = ('%s${color3}${font %s}${alignr} Updated: %s\n\n'):format(res, font(8), os.date("%x", entry.updated_parsed))
    return res
end

function conky_print()
    local f = assert(get_file_handle('rb'))
    local content = f:read("*all")
    f:close()
    return content
end

function fetch_feed()
    local imap4 = require "imap4"
    local Message = require "pop3.message"

    local connection = imap4('imap.gmail.com', 993)

    assert(connection:isCapable('IMAP4rev1'))

    connection:login('****', '****')

    -- Select INBOX with read-only permissions.
    local info = connection:examine('INBOX')
    print(info.exist, info.recent)

    -- List info on the 4 most recent mails.
    for _, v in pairs(connection:fetch('RFC822', (info.exist - 4) .. ':*')) do
        print("-------------------------")
        local msg = Message(v.RFC822)
        print("ID: ", msg:id())
        print("subject: ", msg:subject())
        print("to: ", msg:to())
        print("from: ", msg:from())
        print("from addr: ", msg:from_address())
        print("reply: ", msg:reply_to())
        print("reply addr: ", msg:reply_address())
        print("trunc: ", msg:is_truncated())
        for i, part in ipairs(msg:full_content()) do
            if part.text then print("  ", i, "TEXT: ", part.type, #part.text)
            else print("  ", i, "FILE: ", part.type, part.file_name or part.name, #part.data) end
        end
    end

    -- Close connection
    connection:logout()

    -- Fetch the feed over HTTP
    local http_request = require "http.request"
    local url = "https://www.archlinux.org/feeds/news/"
    local headers, stream = assert(http_request.new_from_uri(url):go())
    local body = assert(stream:get_body_as_string())
    if headers:get ":status" ~= "200" then
        error(body)
    end

    -- Convert the Date header from GMT to local time
    a, s, c = headers:geti(3)
    p = "%a+, (%d+) (%a+) (%d+) (%d+):(%d+):(%d+) GMT"
    day, month, year, hour, min, sec = s:match(p)
    MON = {Jan=1, Feb=2, Mar=3, Apr=4, May=5, Jun=6, Jul=7, Aug=8, Sep=9, Oct=10, Nov=11, Dec=12}
    month = MON[month]
    offset = os.time() - os.time(os.date("!*t"))
    retrieved = os.date("%x", os.time({day=day, month=month, year=year, hour=hour, min=min, sec=sec}) + offset)

    -- Parse
    local feedparser = require("feedparser")
    entries = feedparser.parse(body).entries
    local res = ('${alignr}${font %s}Retrieved: %s\n'):format(font(8), retrieved)
    for key, entry in pairs(entries) do
        res = res .. format_entry(entry)
    end

    -- Save to cache
    local file = get_file_handle("w")
    io.output(file)
    io.write(res)
    io.close(file)
    return res
end

function font(size)
    return ('DejaVu Sans:size=%d'):format(size)
end

function font_b(size)
    return ('DejaVu Sans:bold:size=%d'):format(size)
end

function font_i(size)
    return ('DejaVu Sans:italic:size=%d'):format(size)
end

function get_file_handle(opt)
    local filename = '/home/kuba/.conky/arch/cache'
    return io.open(filename, opt)
end

s = fetch_feed()
54
.conky/music/.conkyrc-music
Executable file
@@ -0,0 +1,54 @@
conky.config = {
    use_spacer='none',
    use_xft=true,
    font='Liberation Sans:Bold:size=24',
    text_buffer_size=2048,
    update_interval=1.0,
    total_run_times=0,

    own_window=true,
    own_window_transparent=true,
    own_window_type='normal',
    own_window_hints='undecorated,skip_taskbar,skip_pager',
    own_window_class='Conky-music',
    own_window_argb_visual=true,
    own_window_argb_value=0,

    draw_shades=false,
    draw_outline=false,
    draw_borders=false,
    stippled_borders=0,
    double_buffer=true,
    draw_blended=false,

    --Maximum size of text area
    maximum_width=1200,

    --alignment='top_left',
    --gap_x=1940,
    --gap_y=0,

    no_buffers=true,
    net_avg_samples=2,

    override_utf8_locale=true,

    short_units=true,

    default_color='dddddd',
    default_shade_color='black',
    color1 = 'C7FF8E',
    color2 = '000000',
    color7 = '333333'
};
conky.text = [[
${if_existing /tmp/kuba_now_playing}
${color2}${exec cat /tmp/kuba_now_playing 2> /dev/null}${color}
${image /tmp/kuba_now_playing_cover.png -p 0,0 -f 3 -s 200x200}
${else}
Not playing
${endif}
]];
26
.conky/music/fetch_art.sh
Executable file
@@ -0,0 +1,26 @@
#!/bin/bash

# Written by Demetrio Ferro <ferrodemetrio@gmail.com> <https://twitter.com/DemetrioFerro>
# Distributed under license GPLv3+ GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY. YOU USE AT YOUR OWN RISK. THE AUTHOR
# WILL NOT BE LIABLE FOR DATA LOSS, DAMAGES, LOSS OF PROFITS OR ANY
# OTHER KIND OF LOSS WHILE USING OR MISUSING THIS SOFTWARE.
# See the GNU General Public License for more details.

first_cover=""

while :
do
    if [ -e ~/.conky/music/meta ]
    then
        new_cover=$(awk -F \' '{for(i=1;i<=NF;i++) if ($i=="mpris:artUrl") print $(i+2)}' ~/.conky/music/meta)
        if [ "$new_cover" != "$first_cover" ]
        then
            first_cover="$new_cover"
            wget -O ~/.conky/music/last_album_pic.png "$new_cover"
        fi
    fi
    sleep 1
done
22
.conky/music/fetch_meta.sh
Executable file
@@ -0,0 +1,22 @@
#!/bin/bash

old_meta=""
while :
do
    if [ "$(playerctl status)" = "Playing" ]
    then
        new_meta=$(playerctl metadata)
        if [ "$new_meta" != "$old_meta" ]
        then
            echo "$new_meta" > ~/.conky/music/meta
            old_meta=$new_meta
        fi
    else
        new_meta=''
        if [ -e ~/.conky/music/meta ]
        then
            rm ~/.conky/music/meta
        fi
    fi
    sleep 1
done
16
.conky/music/format.sh
Executable file
@@ -0,0 +1,16 @@
#!/bin/bash
width=26
space='                    ' #20 spaces
META_FILE=~/.conky/music/meta
if [ -f "$META_FILE" ]; then
    artist=$(awk -F \' '{for(i=1;i<=NF;i++) if ($i=="xesam:artist") print $(i+2)}' "$META_FILE")
    album=$(awk -F \' '{for(i=1;i<=NF;i++) if ($i=="xesam:album") print $(i+2)}' "$META_FILE")
    title=$(awk -F \' '{for(i=1;i<=NF;i++) if ($i=="xesam:title") print $(i+2)}' "$META_FILE")
    out=$(echo "$artist"; echo "$title"; echo "$album")
    out=$(echo "$out" | fmt -s --width $width)
    mapfile -t var <<< "$out"
    for word in "${var[@]}"; do
        echo "$space $word"
    done
else
    echo "Nothing playing"
fi
BIN
.conky/music/last_album_pic.png
Executable file
58
.conky/music/launcher.sh
Executable file
@@ -0,0 +1,58 @@
#!/bin/bash
fetch_art(){
    first_cover=""
    if [ -e ~/.conky/music/meta ]
    then
        new_cover=$(awk -F \' '{for(i=1;i<=NF;i++) if ($i=="mpris:artUrl") print $(i+2)}' ~/.conky/music/meta)
        if [ "$new_cover" != "$first_cover" ]
        then
            first_cover="$new_cover"
            wget -O ~/.conky/music/last_album_pic.png "$new_cover"
        fi
    fi
}

fetch_meta(){
    old_meta=""
    if [ "$(playerctl status)" = "Playing" ]
    then
        new_meta=$(playerctl metadata)
        if [ "$new_meta" != "$old_meta" ]
        then
            echo "$new_meta" > ~/.conky/music/meta
            old_meta=$new_meta
        fi
    else
        new_meta=''
        if [ -e ~/.conky/music/meta ]
        then
            rm ~/.conky/music/meta
        fi
    fi
}

running=false

while :
do
    if [ $running = false ]; then
        spotify_process_id=$(pidof spotify)
        if [[ -n $spotify_process_id ]]; then
            ~/.conky/music/fetch_meta.sh & meta_PID=$!
            ~/.conky/music/fetch_art.sh & art_PID=$!
            running=true
        fi
    else
        spotify_process_id=$(pidof spotify)
        if [[ -z $spotify_process_id ]]; then
            if [[ -n $meta_PID ]]; then
                kill $meta_PID
            fi
            if [[ -n $art_PID ]]; then
                kill $art_PID
            fi
            running=false
        fi
    fi
    sleep 10
done
63
.conky/music/playerctl_listen.py
Normal file
@@ -0,0 +1,63 @@
#!/usr/bin/env python3
import os
import urllib.request
import gi
gi.require_version('Playerctl', '2.0')
from gi.repository import Playerctl, GLib

metadata_file = '/tmp/kuba_now_playing'
album_cover_file = '/tmp/kuba_now_playing_cover.png'
manager = Playerctl.PlayerManager()

last_spotify_metadata = None  # Keep this, as Spotify sends several notifications


def on_play(player, status, manager):
    # print('player is playing: {}'.format(player.props.player_name))
    pass


def download_album_cover(url):
    image = urllib.request.urlopen(url)
    with open(album_cover_file, 'wb') as output:
        output.write(image.read())


space = ' '


def on_metadata(player, metadata, manager):
    global last_spotify_metadata
    if last_spotify_metadata == metadata:
        return

    last_spotify_metadata = metadata
    keys = metadata.keys()
    with open(metadata_file, 'w+') as f:
        f.write('{}{}\n'.format(space, metadata['xesam:artist'][0]))
        f.write('{}{}\n'.format(space, metadata['xesam:title']))
        f.write('{}{}'.format(space, metadata['xesam:album']))
    download_album_cover(metadata['mpris:artUrl'])


def init_player(name):
    # Choose whether to manage the player based on its name
    if name.name in ['spotify']:  # , 'vlc', 'cmus'
        player = Playerctl.Player.new_from_name(name)
        # player.connect('playback-status::playing', on_play, manager)
        player.connect('metadata', on_metadata, manager)
        manager.manage_player(player)
        return player


def on_name_appeared(manager, name):
    player = init_player(name)
    # if player != None:
    #     print('player has appeared: {}'.format(player.props.player_name))


def on_player_vanished(manager, player):
    print('player has exited: {}'.format(player.props.player_name))
    os.remove(metadata_file)


manager.connect('name-appeared', on_name_appeared)
manager.connect('player-vanished', on_player_vanished)

for name in manager.props.player_names:
    init_player(name)

main = GLib.MainLoop()
main.run()
74
.conky/pacman.py
Normal file
@@ -0,0 +1,74 @@
#!/usr/bin/env python3
import pathlib
import re
from datetime import datetime

LOG_PATH = pathlib.Path("/var/log/pacman.log")


def get_log():
    with LOG_PATH.open() as fp:
        log_text = fp.read()

    return log_text


def parse_pacman_line(line):
    if len(line) > 0:
        date_str = line[:18]
        pacman_indicator = line[19:27]
        message = line[28:]

        if pacman_indicator == "[PACMAN]":
            parsed_date = datetime.strptime(date_str,
                                            "[%Y-%m-%d %H:%M]")
            return {"date": parsed_date,
                    "message": message}

    return None


def get_last_sync(log_text):
    last_sync = None
    for line in log_text.split("\n"):
        parsed_line = parse_pacman_line(line)
        if parsed_line is None:
            continue
        if parsed_line["message"] == "synchronizing package lists":
            last_sync = parsed_line["date"]

    return last_sync


def get_last_system_upgrade(log_text):
    last_upgrade = None
    for line in log_text.split("\n"):
        parsed_line = parse_pacman_line(line)
        if parsed_line is None:
            continue
        if parsed_line["message"] == "starting full system upgrade":
            last_upgrade = parsed_line["date"]

    return last_upgrade


def get_diffs(t):
    now = datetime.now()
    diff = now - t

    if diff.days != 0:
        return "{}d".format(diff.days)
    elif diff.seconds > 3600:
        return "{}h".format(diff.seconds // 3600)
    else:
        return "{}m".format(diff.seconds // 60)


if __name__ == "__main__":
    text = get_log()
    last_sync = get_last_sync(text)
    last_upgrade = get_last_system_upgrade(text)
    last_sync_diff = get_diffs(last_sync)
    last_upgrade_diff = get_diffs(last_upgrade)

    print("S: {} U: {}".format(last_sync_diff,
                               last_upgrade_diff))
42
.conky/scripts/.passwords.json
Executable file
@@ -0,0 +1,42 @@
{
    "gmail":
    {
        "username": "jakub.fojt96@gmail.com",
        "password": "donotforget"
    },
    "github":
    {
        "username": "kuben",
        "password": "githubpassword",
        "fav_repos": ["madhur.github.com", "portablejekyll", "GAnalytics", "wunder-java", "msysgit-2.0.0"]
    },
    "feedly":
    {
        "access_token": "feedly_access_token",
        "access_code": "feedly_access_code",
        "refresh_token": "feedly_refresh_token"
    },
    "twitter":
    {
        "key": "xxxxxxxx",
        "secret": "xxxxxxxx",
        "access_token": "xxxxxxxx",
        "user": "your twitter screen name"
    },
    "pocket":
    {
        "key": "xxxxx",
        "request_token": "xxxxxx",
        "access_token": "xxxxxxx"
    },
    "so":
    {
        "userid": "stackoverflow_user_id"
    },
    "weather":
    {
        "location_code": "SWXX0043"
    }
}
10
.conky/scripts/disk.sh
Executable file
@@ -0,0 +1,10 @@
awk '{ if ( $1 ~ /\/dev/ )
    {
        num_elem = split($2, str_array, "/")
        if (str_array[num_elem] == "")
        {
            str_array[num_elem] = "/";
        }
        printf "%5.5s: ${fs_free %s} / ${fs_size %s}\n${fs_bar 6 %s}\n", str_array[num_elem], $2, $2, $2
    }
}' /proc/mounts
17
.conky/scripts/facebook.sh
Executable file
@@ -0,0 +1,17 @@
#!/bin/bash

msgcount=$(fbcmd NOTIFY | grep MESSAGES_UNREAD | grep -oE "[[:digit:]]{1,}")
notifycount=$(fbcmd NOTICES unread | grep -c :title)
friendcount=$(fbcmd NOTIFY | grep FRIEND_REQUESTS | grep -oE "[[:digit:]]{1,}")
currenttime=$(date +%I:%M)

if [[ "$msgcount" -eq "0" ]] && [[ "$notifycount" -eq "0" ]] && [[ "$friendcount" -eq "0" ]]
then
    echo '${color}No new updates ${alignr}Updated: ${color white}'"$currenttime"
else
    echo '${color white}'"$msgcount"'${color aaaaaa} NEW MESSAGE(S) ${alignr}Updated: ${color white}'"$currenttime"
    echo '${color white}'"$notifycount"'${color aaaaaa} NEW NOTIFICATION(S)'
    echo '${color white}'"$friendcount"'${color aaaaaa} NEW FRIEND REQUEST(S)'
fi
71
.conky/scripts/feed_read.py
Executable file
@@ -0,0 +1,71 @@
#!/usr/bin/env python

from feedly import FeedlyClient
import json
from subprocess import call
from os.path import expanduser
import time

# Categories to ignore; you can add yours
ignored = ["global.all", "global.must", "global.uncategorized", "Security", "Ignore", "GMAT", "sharepoint", "madhur"]

FEEDLY_REDIRECT_URI = "http://localhost"


def get_feedly_client(token=None):
    if token:
        return FeedlyClient(token=token, sandbox=False)
    else:
        return FeedlyClient(
            client_id=FEEDLY_CLIENT_ID,
            client_secret=FEEDLY_CLIENT_SECRET,
            sandbox=False
        )


def auth(request):
    feedly = get_feedly_client()
    # Redirect the user to the feedly authorization URL to get the user code
    code_url = feedly.get_code_url(FEEDLY_REDIRECT_URI)
    return redirect(code_url)


def callback(request):
    code = request.GET.get('code', '')
    if not code:
        return HttpResponse('The authentication failed.')

    feedly = get_feedly_client()

    # Response containing the access token
    res_access_token = feedly.get_access_token(FEEDLY_REDIRECT_URI, code)
    # User id
    if 'errorCode' in res_access_token.keys():
        return HttpResponse('The authentication failed.')

    id = res_access_token['id']
    access_token = res_access_token['access_token']


def feed(access_token):
    '''Get the user's subscriptions'''
    feedly = get_feedly_client()
    user_subscriptions = feedly.get_user_subscriptions(access_token)


json_data = open(expanduser('~') + '/.conky/scripts/.passwords.json')
data = json.load(json_data)
access_token = data['feedly']['access_token']

client = get_feedly_client(access_token)
categories = client.get_user_categories(access_token)
counts = client.get_unread_count(access_token)

text = ""
count = 0
for item in counts['unreadcounts']:
    itemcount = item['count']
    itemname = item['id'][51:]
    if (itemcount > 0 and itemname not in ignored and "user/23bbb2c4-62b9-4bb9-a756-556cef1512f9/category/" in item['id']):
        count = count + itemcount
        text = text + "${color1}%s${alignr}${color white} %d" % (itemname, itemcount) + "\n"

print "${color1}Total unread: ${color white}%s ${alignr}${color1}Updated: ${color white}%s" % (count, time.strftime("%I:%M"))
print text

#call(['notify-send','Feedly Updated'])
123
.conky/scripts/feedly.py
Executable file
@@ -0,0 +1,123 @@
#!/usr/bin/env python
import requests
import json


class FeedlyClient(object):
    def __init__(self, **options):
        self.client_id = options.get('client_id')
        self.client_secret = options.get('client_secret')
        self.sandbox = options.get('sandbox', True)
        if self.sandbox:
            default_service_host = 'sandbox.feedly.com'
        else:
            default_service_host = 'cloud.feedly.com'
        self.service_host = options.get('service_host', default_service_host)
        self.additional_headers = options.get('additional_headers', {})
        self.token = options.get('token')
        self.secret = options.get('secret')

    def get_code_url(self, callback_url):
        scope = 'https://cloud.feedly.com/subscriptions'
        response_type = 'code'

        request_url = '%s?client_id=%s&redirect_uri=%s&scope=%s&response_type=%s' % (
            self._get_endpoint('v3/auth/auth'),
            self.client_id,
            callback_url,
            scope,
            response_type
        )
        return request_url

    def get_access_token(self, redirect_uri, code):
        params = dict(
            client_id=self.client_id,
            client_secret=self.client_secret,
            grant_type='authorization_code',
            redirect_uri=redirect_uri,
            code=code
        )

        quest_url = self._get_endpoint('v3/auth/token')
        res = requests.post(url=quest_url, params=params)
        return res.json()

    def refresh_access_token(self, refresh_token):
        '''Obtain a new access token by sending a refresh token to the feedly Authorization server'''
        params = dict(
            refresh_token=refresh_token,
            client_id=self.client_id,
            client_secret=self.client_secret,
            grant_type='refresh_token',
        )
        quest_url = self._get_endpoint('v3/auth/token')
        res = requests.post(url=quest_url, params=params)
        return res.json()

    def get_user_subscriptions(self, access_token):
        '''Return the list of user subscriptions'''
        headers = {'Authorization': 'OAuth ' + access_token}
        quest_url = self._get_endpoint('v3/subscriptions')
        res = requests.get(url=quest_url, headers=headers)
        return res.json()

    def get_user_categories(self, access_token):
        '''Return the list of user categories'''
        headers = {'Authorization': 'OAuth ' + access_token}
        quest_url = self._get_endpoint('v3/categories')
        res = requests.get(url=quest_url, headers=headers)
        return res.json()

    def get_unread_count(self, access_token):
        '''Return the unread counts for all streams'''
        headers = {'Authorization': 'OAuth ' + access_token}
        quest_url = self._get_endpoint('v3/markers/counts')
        res = requests.get(url=quest_url, headers=headers)
        return res.json()

    def get_feed_content(self, access_token, streamId, unreadOnly, newerThan):
        '''Return the contents of a feed'''
        headers = {'Authorization': 'OAuth ' + access_token}
        quest_url = self._get_endpoint('v3/streams/contents')
        params = dict(
            streamId=streamId,
            unreadOnly=unreadOnly,
            newerThan=newerThan
        )
        res = requests.get(url=quest_url, params=params, headers=headers)
        return res.json()

    def mark_article_read(self, access_token, entryIds):
        '''Mark one or multiple articles as read'''
        headers = {'content-type': 'application/json',
                   'Authorization': 'OAuth ' + access_token
                   }
        quest_url = self._get_endpoint('v3/markers')
        params = dict(
            action="markAsRead",
            type="entries",
            entryIds=entryIds,
        )
        res = requests.post(url=quest_url, data=json.dumps(params), headers=headers)
        return res

    def save_for_later(self, access_token, user_id, entryIds):
        '''Save for later. entryIds is a list of entry ids.'''
        headers = {'content-type': 'application/json',
                   'Authorization': 'OAuth ' + access_token
                   }
        request_url = self._get_endpoint('v3/tags') + '/user%2F' + user_id + '%2Ftag%2Fglobal.saved'

        params = dict(
            entryIds=entryIds
        )
        res = requests.put(url=request_url, data=json.dumps(params), headers=headers)
        return res

    def _get_endpoint(self, path=None):
        url = "https://%s" % (self.service_host)
        if path is not None:
            url += "/%s" % path
        return url
42
.conky/scripts/github.py
Executable file
@@ -0,0 +1,42 @@
#! /usr/bin/env python

import urllib2
import json
import zlib
import base64
from subprocess import call
from os.path import expanduser
import time

json_data = open(expanduser('~') + '/.conky/scripts/.passwords.json')
data = json.load(json_data)
username = data['github']['username']
password = data['github']['password']

# Repos to publish
repos = data['github']['fav_repos']

request = urllib2.Request("https://api.github.com/users/" + username)
base64string = base64.encodestring('%s:%s' % (username, password)).replace('\n', '')
request.add_header("Authorization", "Basic %s" % base64string)

j = urllib2.urlopen(request)
json_data = j.read()
j_obj = json.loads(json_data)

print "${color1}Followers: ${color white}%d ${alignr}${color1}Updated: ${color white}%s" % (j_obj['followers'], time.strftime("%I:%M"))

for repo in repos:
    repourl = "https://api.github.com/repos/" + username + "/" + repo

    request = urllib2.Request(repourl)
    request.add_header("Authorization", "Basic %s" % base64string)

    j = urllib2.urlopen(request)
    json_data = j.read()
    j_obj = json.loads(json_data)

    print "${color white}%s ${goto 200}${color1}Starred: ${color white}%d ${color1}${alignr}Forks: ${color white}%d" % (j_obj['name'], j_obj['stargazers_count'], j_obj['forks_count'])
70
.conky/scripts/gmail.py
Executable file
@@ -0,0 +1,70 @@
#! /usr/bin/env python3

import urllib.request  # For BasicHTTPAuthentication
import feedparser  # For parsing the feed
from textwrap import wrap  # For pretty-printing assistance
import json
from os.path import expanduser
import sys
import time

_URL = "https://mail.google.com/gmail/feed/atom/unread"
WRAP_LIMIT = 50


def auth():
    '''Do HTTP basic authentication and return the feed'''
    json_data = open(expanduser('~') + '/.conky/scripts/.passwords.json')
    data = json.load(json_data)
    username = data['gmail']['username']
    password = data['gmail']['password']

    auth_handler = urllib.request.HTTPBasicAuthHandler()
    auth_handler.add_password(realm='New mail feed',
                              uri='https://mail.google.com/',
                              user=username,
                              passwd=password)

    opener = urllib.request.build_opener(auth_handler)
    # ...and install it globally so it can be used with urlopen.
    urllib.request.install_opener(opener)

    f = opener.open(_URL)
    feed = f.read()
    return feed


def fill(text, width):
    '''A custom method to assist in pretty printing'''
    if len(text) < width:
        return text + ' ' * (width - len(text))
    else:
        return text


def readmail(feed):
    '''Parse the Atom feed and print a summary'''
    atom = feedparser.parse(feed)

    print("${color white}You have %s new mails${color} ${alignr}Updated: ${color white}%s" % ((len(atom.entries)), time.strftime("%I:%M")))

    for i in range(len(atom.entries)):
        if i > 10:
            break
        if len(atom.entries[i].title) > WRAP_LIMIT:
            # print("%s" % (fill(wrap(atom.entries[i].title, 50)[0] + " ...", 55)))
            print("${color1}%s" % (wrap(atom.entries[i].title, WRAP_LIMIT)[0] + " ..."))
        else:
            print("${color1}%s" % (wrap(atom.entries[i].title, WRAP_LIMIT)[0]))


def countmail(feed):
    '''Parse the Atom feed and print the unread count'''
    atom = feedparser.parse(feed)
    print("Emails: %s new" % len(atom.entries))


if __name__ == "__main__":
    f = auth()  # Do auth and then get the feed
    if len(sys.argv) > 1:
        countmail(f)
    else:
        readmail(f)  # Let the feed be chewed by feedparser
26
.conky/scripts/pocket.py
Executable file
@@ -0,0 +1,26 @@
#! /usr/bin/env python

import urllib2
import urllib
import json
from subprocess import call
from os.path import expanduser
import time

json_data = open(expanduser('~') + '/.conky/scripts/.passwords.json')
data = json.load(json_data)
key = data['pocket']['key']
access_token = data['pocket']['access_token']

data = {'consumer_key': key, 'access_token': access_token}
data = urllib.urlencode(data)

request = urllib2.Request("https://getpocket.com/v3/stats")

j = urllib2.urlopen(request, data)
json_data = j.read()
j_obj = json.loads(json_data)

print "${color1}Pocket Unread: ${alignr}${color white}%d" % (j_obj['count_unread'])
31
.conky/scripts/so.py
Executable file
@@ -0,0 +1,31 @@
#! /usr/bin/env python

import urllib2
import json
import zlib
from subprocess import call
import sys
import time
from os.path import expanduser

json_data=open(expanduser('~')+'/.conky/scripts/.passwords.json')
data = json.load(json_data)
userid=data['so']['userid']

so = 'https://api.stackexchange.com/2.2/users/'+userid+'?order=desc&sort=reputation&site=stackoverflow'

j = urllib2.urlopen(so)
json_data = j.read()
if j.info()['Content-Encoding'] == 'gzip':
    json_data = zlib.decompress(json_data, zlib.MAX_WBITS + 16)
j_obj = json.loads(json_data)
if(len(sys.argv) > 1):
    print "%s: %s" %("Reputation", j_obj['items'][0]['reputation'])
else:
    print "${color}%s: ${alignr}${color white} %s" %("Stackoverflow Reputation", j_obj['items'][0]['reputation'])
    print " ${color}%s: ${alignr}${color white} %s" %("Month", j_obj['items'][0]['reputation_change_month'])
    print " ${color}%s: ${alignr}${color white} %s" %("Week", j_obj['items'][0]['reputation_change_week'])
    print " ${color}%s: ${alignr}${color white} %s" %("Day", j_obj['items'][0]['reputation_change_day'])

#call(['notify-send','Conky Updated'])
24
.conky/scripts/twitter.py
Executable file
@@ -0,0 +1,24 @@
#! /usr/bin/env python

import urllib2
import json
import zlib
import base64
from subprocess import call
from os.path import expanduser

json_data=open(expanduser('~')+'/.conky/scripts/.passwords.json')
data = json.load(json_data)
access_token=data['twitter']['access_token']
user = data['twitter']['user']

request = urllib2.Request("https://api.twitter.com/1.1/users/show.json?screen_name=" + user)
bearer_value = 'Bearer %s' % access_token
request.add_header("Authorization", bearer_value)

j = urllib2.urlopen(request)
json_data = j.read()
j_obj = json.loads(json_data)

print "${color1}Twitter Followers: ${alignr}${color white}%d" %(j_obj['followers_count'])
53
.conky/weather/.conkyrc-weather1
Executable file
@@ -0,0 +1,53 @@
conky.config = {
use_spacer='none',
use_xft=true,
font='Open Sans Light:size=11',
text_buffer_size=2048,
update_interval=3600.0,
total_run_times=0,

own_window=true,
own_window_transparent=true,
own_window_type='normal',
own_window_hints='undecorated,skip_taskbar,skip_pager',
own_window_class='Conky-weather',
own_window_argb_visual=true,
own_window_argb_value=0,

draw_shades=false,
draw_outline=false,
draw_borders=false,
stippled_borders=0,
double_buffer=true,
draw_blended=false,

default_color='white',
default_shade_color='black',
--Minimum size of text area
maximum_width=1200,

alignment='bottom_right',
gap_x=20,
gap_y=20,
border_inner_margin=15,
border_outer_margin=0,

no_buffers=true,
net_avg_samples=2,

override_utf8_locale=true,

short_units=true,

color1 = 'ffffff',
color7 = '333333'
-- default_outline_color='black',--'00ccee',
-- lua_load = '~/.conky/arch/lua.lua'
};
conky.text = [[
${execp ~/.conky/weather/weather.py 890869}
]];
53
.conky/weather/.conkyrc-weather2
Executable file
@@ -0,0 +1,53 @@
conky.config = {
use_spacer='none',
use_xft=true,
font='Open Sans Light:size=11',
text_buffer_size=2048,
update_interval=3600.0,
total_run_times=0,

own_window=true,
own_window_transparent=true,
own_window_type='normal',
own_window_hints='undecorated,skip_taskbar,skip_pager',
own_window_class='Conky-weather',
own_window_argb_visual=true,
own_window_argb_value=0,

draw_shades=false,
draw_outline=false,
draw_borders=false,
stippled_borders=0,
double_buffer=true,
draw_blended=false,

default_color='white',
default_shade_color='black',
--Minimum size of text area
maximum_width=1200,

alignment='bottom_right',
gap_x=20,
gap_y=20,
border_inner_margin=15,
border_outer_margin=0,

no_buffers=true,
net_avg_samples=2,

override_utf8_locale=true,

short_units=true,

color1 = 'ffffff',
color7 = '333333'
-- default_outline_color='black',--'00ccee',
-- lua_load = '~/.conky/arch/lua.lua'
};
conky.text = [[
${execp ~/.conky/weather/weather.py 909319}
]];
1
.conky/weather/last-890869.json
Normal file
@@ -0,0 +1 @@
{"timestamp": "1546242922", "location": {"city": "Gothenburg", "country": "Sweden", "region": " Vastra Gotaland"}, "wind": {"chill": "25", "direction": "165", "speed": "27.36"}, "atmosphere": {"humidity": "93", "pressure": "34473.45", "rising": "0", "visibility": "20.28"}, "astronomy": {"sunrise": "8:56 am", "sunset": "3:36 pm"}, "lat": "57.701328", "long": "11.96689", "condition": {"code": "26", "date": "Mon, 31 Dec 2018 08:00 AM CET", "temp": "0", "text": "Cloudy"}, "forecast": [{"code": "26", "date": "31 Dec 2018", "day": "Mon", "high": "7", "low": "-1", "text": "Cloudy"}, {"code": "24", "date": "01 Jan 2019", "day": "Tue", "high": "7", "low": "3", "text": "Windy"}, {"code": "32", "date": "02 Jan 2019", "day": "Wed", "high": "2", "low": "0", "text": "Sunny"}, {"code": "30", "date": "03 Jan 2019", "day": "Thu", "high": "1", "low": "-2", "text": "Partly Cloudy"}, {"code": "28", "date": "04 Jan 2019", "day": "Fri", "high": "6", "low": "0", "text": "Mostly Cloudy"}, {"code": "30", "date": "05 Jan 2019", "day": "Sat", "high": "4", "low": "0", "text": "Partly Cloudy"}, {"code": "28", "date": "06 Jan 2019", "day": "Sun", "high": "5", "low": "0", "text": "Mostly Cloudy"}, {"code": "30", "date": "07 Jan 2019", "day": "Mon", "high": "5", "low": "2", "text": "Partly Cloudy"}, {"code": "30", "date": "08 Jan 2019", "day": "Tue", "high": "3", "low": "0", "text": "Partly Cloudy"}, {"code": "30", "date": "09 Jan 2019", "day": "Wed", "high": "3", "low": "0", "text": "Partly Cloudy"}]}
1
.conky/weather/last-909319.json
Normal file
@@ -0,0 +1 @@
{"timestamp": "1546242922", "location": {"city": "Vasteras", "country": "Sweden", "region": " Vastmanland"}, "wind": {"chill": "19", "direction": "190", "speed": "16.09"}, "atmosphere": {"humidity": "100", "pressure": "34541.18", "rising": "0", "visibility": "7.24"}, "astronomy": {"sunrise": "8:53 am", "sunset": "3:2 pm"}, "lat": "59.61998", "long": "16.53591", "condition": {"code": "26", "date": "Mon, 31 Dec 2018 08:00 AM CET", "temp": "-3", "text": "Cloudy"}, "forecast": [{"code": "28", "date": "31 Dec 2018", "day": "Mon", "high": "4", "low": "-3", "text": "Mostly Cloudy"}, {"code": "28", "date": "01 Jan 2019", "day": "Tue", "high": "6", "low": "0", "text": "Mostly Cloudy"}, {"code": "23", "date": "02 Jan 2019", "day": "Wed", "high": "0", "low": "-3", "text": "Breezy"}, {"code": "30", "date": "03 Jan 2019", "day": "Thu", "high": "-2", "low": "-5", "text": "Partly Cloudy"}, {"code": "28", "date": "04 Jan 2019", "day": "Fri", "high": "2", "low": "-3", "text": "Mostly Cloudy"}, {"code": "30", "date": "05 Jan 2019", "day": "Sat", "high": "3", "low": "-1", "text": "Partly Cloudy"}, {"code": "28", "date": "06 Jan 2019", "day": "Sun", "high": "1", "low": "-1", "text": "Mostly Cloudy"}, {"code": "28", "date": "07 Jan 2019", "day": "Mon", "high": "1", "low": "0", "text": "Mostly Cloudy"}, {"code": "30", "date": "08 Jan 2019", "day": "Tue", "high": "0", "low": "-3", "text": "Partly Cloudy"}, {"code": "30", "date": "09 Jan 2019", "day": "Wed", "high": "-1", "low": "-3", "text": "Partly Cloudy"}]}
84
.conky/weather/weather.py
Executable file
@@ -0,0 +1,84 @@
#! /usr/bin/env python3

import urllib
import json
#from subprocess import call
import sys
import time
import os.path
import urllib.parse
import urllib.request
#import time

woeid=sys.argv[1]
path="/home/kuba/.conky/weather/last-"+woeid+".json"
#Try to download new data; overwrite existing file
def fetch_data():
    base_url = "https://query.yahooapis.com/v1/public/yql"
    sel_url = "?q=select%20location,wind,atmosphere,astronomy,item%20from%20weather.forecast"
    where_url = "%20where%20u='c'%20and%20woeid%20=%20" + woeid
    format_url = "&format=json&env=store%3A%2F%2Fdatatables.org%2Falltableswithkeys"
    yql_url = base_url + sel_url + where_url + format_url
    try:
        result = urllib.request.urlopen(yql_url).read()
        data = json.loads(result)
        ch = data['query']['results']['channel']

        timestamp = str(round(time.time()))
        parsed_data = {} #Don't save everything
        parsed_data['timestamp'] = timestamp
        for s in ['location', 'wind', 'atmosphere', 'astronomy']:
            parsed_data.update({s : ch[s]})
        for s in ['lat','long','condition','forecast']:
            parsed_data.update({s : ch['item'][s]})
        with open(path, 'w') as outfile:
            json.dump(parsed_data, outfile)
    except urllib.error.URLError:
        pass
    return

def print_output(data):
    timestamp = int(data['timestamp'])

    city = data['location']['city']
    country = data['location']['country']

    current_condition = data['condition']['text']
    current_code = data['condition']['code']
    current_temp = data['condition']['temp']
    #weather_date = data['condition']['date']

    speed_unit = "km/h"
    speed = data['wind']['speed']

    humidity = data['atmosphere']['humidity']

    forecast = []
    for forec in data['forecast']:
        forecast.append((forec['day'], forec['low'], forec['high'], forec['code']))

    print ("${font Open Sans:size=15:style=Light}%s, %s " % (city, country), end = '')
    print ("${font :size=6}${alignr}${color7}Updated: ${color white}%s ${color7}Fetched: ${color white}%s" % (time.strftime("%I:%M"), time.strftime("%m/%d %H:%M", time.gmtime(timestamp))))
    print ("${color7}${hr}${color}")
    print ("${voffset -5}${font Open Sans:size=60:style=Light}%s°${font}" %(current_temp))
    print ("${offset 250}${voffset -65}%s" %(current_condition))
    print ("${image ~/.conky/.conky-google-now/%s.png -p 180,45 -s 60x60}" %(current_code))
    print ("${image ~/.conky/.conky-google-now/wind.png -p 245,62 -s 15x15}${goto 35}${offset 250}${voffset -12}%s %s" %(speed, speed_unit), end = '')
    print ("${goto 400}%s ${goto 530} %s" %(forecast[0][0].upper(), forecast[1][0].upper()))
    print ("${image ~/.conky/.conky-google-now/humidity.png -p 245,81 -s 15x15}${goto 35}${offset 250}%s %s" %(humidity,"%"), end = '')
    print ("${goto 400}%s°${color6}%s°${color}${goto 530}%s°${color6}%s°${color}${voffset 15}" %(forecast[0][2], forecast[0][1], forecast[1][2], forecast[1][1]))
    print ("${image ~/.conky/.conky-google-now/%s.png -p 440,65 -s 30x30}${image ~/.conky/.conky-google-now/%s.png -p 570,65 -s 30x30}${voffset -10}" %(forecast[0][3], forecast[1][3]))
    return

#Open existing file and check if it's not too old
if not (os.path.isfile(path)):
    fetch_data()
with open(path) as infile:
    data = json.load(infile)
ts = int(data['timestamp'])
now = round(time.time())
if (ts + 3600 < now):
    fetch_data()
    ts = now
print_output(data)
4
Obrazy/Wallpapers/.directory
Normal file
@@ -0,0 +1,4 @@
[Dolphin]
PreviewsShown=true
Timestamp=2013,5,20,10,21,15
Version=3
BIN
Obrazy/Wallpapers/13313344.jpg
Executable file
After Width: | Height: | Size: 673 KiB |
BIN
Obrazy/Wallpapers/1375441582033.jpg
Normal file
After Width: | Height: | Size: 341 KiB |
BIN
Obrazy/Wallpapers/1375441723497.jpg
Normal file
After Width: | Height: | Size: 429 KiB |
BIN
Obrazy/Wallpapers/1375442049691.png
Normal file
After Width: | Height: | Size: 334 KiB |
BIN
Obrazy/Wallpapers/1375443008786.jpg
Normal file
After Width: | Height: | Size: 375 KiB |
BIN
Obrazy/Wallpapers/1375444043375.jpg
Normal file
After Width: | Height: | Size: 957 KiB |
BIN
Obrazy/Wallpapers/1375444086530.jpg
Normal file
After Width: | Height: | Size: 253 KiB |
BIN
Obrazy/Wallpapers/1375444498752.jpg
Normal file
After Width: | Height: | Size: 472 KiB |
BIN
Obrazy/Wallpapers/1375446978442.jpg
Normal file
After Width: | Height: | Size: 293 KiB |
1
Obrazy/Wallpapers/1_main
Symbolic link
@@ -0,0 +1 @@
/home/kuba/Obrazy/Wallpapers/AxEcN - Imgur.jpg
1
Obrazy/Wallpapers/2_web
Symbolic link
@@ -0,0 +1 @@
OnBlH - Imgur.jpg
BIN
Obrazy/Wallpapers/3Yuz5 - Imgur.jpg
Normal file
After Width: | Height: | Size: 197 KiB |
1
Obrazy/Wallpapers/3_music
Symbolic link
@@ -0,0 +1 @@
/home/kuba/Obrazy/Wallpapers/XQeqNhY.jpg
1
Obrazy/Wallpapers/4_work
Symbolic link
@@ -0,0 +1 @@
/home/kuba/Obrazy/Wallpapers/Arp4x - Imgur.jpg
1
Obrazy/Wallpapers/5_terms
Symbolic link
@@ -0,0 +1 @@
/home/kuba/Obrazy/Wallpapers/1375446978442.jpg
1
Obrazy/Wallpapers/6_stats
Symbolic link
@@ -0,0 +1 @@
/home/kuba/Obrazy/Wallpapers/1375441723497.jpg
1
Obrazy/Wallpapers/7
Symbolic link
@@ -0,0 +1 @@
/home/kuba/Obrazy/Wallpapers/[No Spoilers] GoT in the 90's - desktop wallpaper edition! 1920x1080 (source Moshi-kun.tumblr.com) - Imgur.jpg
1
Obrazy/Wallpapers/8
Symbolic link
@@ -0,0 +1 @@
/home/kuba/Obrazy/Wallpapers/1375441582033.jpg
1
Obrazy/Wallpapers/9
Symbolic link
@@ -0,0 +1 @@
/home/kuba/Obrazy/Wallpapers/1375443008786.jpg
BIN
Obrazy/Wallpapers/Arp4x - Imgur.jpg
Normal file
After Width: | Height: | Size: 364 KiB |
BIN
Obrazy/Wallpapers/AxEcN - Imgur.jpg
Normal file
After Width: | Height: | Size: 129 KiB |
BIN
Obrazy/Wallpapers/OnBlH - Imgur.jpg
Normal file
After Width: | Height: | Size: 95 KiB |
BIN
Obrazy/Wallpapers/XQeqNhY.jpg
Normal file
After Width: | Height: | Size: 234 KiB |
After Width: | Height: | Size: 456 KiB |
1
Obrazy/Wallpapers/default
Symbolic link
@@ -0,0 +1 @@
/home/kuba/Obrazy/Wallpapers/1375444086530.jpg