〰 Tidal, Archiloque's feed reader
I recognize only Cogmind and Dwarf Fortress. Can anyone please identify others?
submitted by /u/bububoom
With IBM InterConnect 2017 around the corner, it’s time to start building agendas and signing up for onsite activities. In my recent blog, I shared my experience of the DevOps workshops from the last two InterConnect conferences and this year’s workshops look even more interesting.
This year's conference is packed with the best IBM DevOps and industry experts, new topics, and great networking. The workshops differ from the regular breakout sessions because they are interactive, small-group sessions led by IBM DevOps experts who facilitate the discussion and provide input, promoting learning through structured exercises and the sharing of ideas and experiences. PowerPoint decks will be in scarce supply!
Last year, we got an overwhelming response for our workshops. Here’s what we learned from you:
Interactive – You loved the format of the workshops because it allowed you to interact with your peers. One participant mentioned, “Having perspectives from other organizations in the same place from a technology perspective and learning how hybrid cloud can solve some of these issues or alleviate pressures was among the best things in these workshops.”
Collaboration – Group discussions, collaborative thinking, hands-on experience, and a practical, exercise-based approach were big attractions. The feedback we received from participants ranged from “The DevOps workshops with their discussion and exercise based approach helped in better understanding of concepts” to “Liked the group discussions and hands-on experience” and “The workshops stimulated collaborative thinking in the participants, which led to great learning.”
Effective Speakers – You liked the facilitators, their approach, their knowledge, and the effectiveness of their responses to the questions asked. You enjoyed the Q&A with our experts. One participant mentioned, “Very knowledgeable facilitators; they had an answer to every question, and their answers were short, simple but clear.”
All things DevOps – You liked our themes: discussions on current DevOps processes, models, and ROI tools and spreadsheets; reinforcement of concepts common across many companies; automated-process scenarios; and ideas around the ops/dev/DevOps relationship.
We have also incorporated your feedback to ensure a better experience this year. In addition, some of you requested that the workshops be longer and have an industry perspective. So, this year we have a dedicated workshop to do just that: Hold Your Horses or Let Them Run Wild? A Financial Institution’s DevOps Journey.
Also, this year the DevOps workshops will follow the “Lean Coffee” format, a structured but agenda-less style of meeting. Participants gather, build an agenda, and begin talking. Conversations are directed and productive because the agenda was democratically generated.
Take a few minutes this week to review the workshops, and remember that workshop registration is on a first-come, first-served basis. After you register for the conference, make sure you add the workshop to your InterConnect 2017 agenda to reserve your seat. You can read more about the workshops in detail here: http://bit.ly/ibmdevopsworkshops
Register for the workshops today at http://bit.ly/ibmworkshops
Note: You must be registered for InterConnect to attend the workshops: http://bit.ly/interconnect_2017
You can follow me on Twitter at @dishagmittal or @ibmdevops to get the latest updates on what's new for DevOps at InterConnect 2017.
Disha Garg Mittal
Content Strategist, IBM Cloud Marketing - DevOps
#100DoPP d31: Fresh new work laptop, minimally setup just for my next trip. Photo taken on iPod6. Posted from new laptop. #nofilter
At first glance, the history and evolution of the PC is long and broad. Let us first try to pin down our definition. In its simplest form, “Personal Computer” means a computer within reach of every regular person, rather than the preserve of prestigious universities, corporations, and government or army departments. The definition is blurry, both now and in the past, but let's work with it for now. If computers were invented to solve complex mathematical problems, that purpose has certainly taken a backseat in modern PCs. So whilst the “computer” has been around in one form or another since 1937, the history of the PC stretches back (arguably) to 1975.
If we are defining it simply as a Personal Computer, then the earliest example may be the Librascope LGP-30. Invented by the physicist Stan Frankel at Caltech, the Librascope LGP-30 was sold to defense contractor General Precision in 1956 for $50,000 – not exactly within reach of the average person. The reason it is dubbed a PC is that it could be used by just one person.
But using our definition, it could be argued that the first real PC was the MITS Altair 8800 kit. It sold for $297 in 1975 and was the first to use the name “Personal Computer,” a term coined by MITS co-founder Ed Roberts, who invented the machine. It soon became the core of hobbyist computing and remained in production until 1978.
The next step in PC evolution came soon after, in 1976, with the introduction of the Video Display Module (VDM) at the Altair Convention. This visual display made interactive, on-screen games possible.
It was a blockbuster couple of years, because it was in 1977 that the Apple II was introduced. Arguably, the Apple I came before it, but the success and practicality of the PC did not arrive until the Apple II. It came with a power supply, keyboard and case, could be connected to a color television set, and was capable of producing impressive color graphics. Millions of Apple IIs were sold, and it remained on sale until 1993. It found a place in homes, universities, schools and workplaces, and thus earned its place as the first real PC.
The next step came when IBM introduced its own PC. Until then, IBM was largely a player in the industrial, government and military space. Undoubtedly, after seeing the success and widespread adoption of the Apple II, it quickly realised the need to release its own PC. The IBM Model 5150 was released in August 1981 and looked remarkably similar to what PC owners today would recognise as a modern computer. It also introduced MS-DOS as an operating system and revolutionised business computing. Many later products within the ecosystem of PC software and peripherals can trace their lineage back to the IBM 5150.
But it was the Apple Lisa in 1983 that introduced the graphical user interface (GUI), removing the last big hurdle for most home users. The Lisa itself was not a huge success, but it led directly to the Apple Macintosh in 1984 and to all future Windows successors.
And it was in 1992, when Apple released its PowerBook series of laptops, that the PC finally went mobile. Apple had attempted a portable PC before, with the Macintosh Portable in 1989, but it was heavy and expensive. It was with the release of the PowerBooks that the true form of the modern laptop first appeared; what we recognise today as a laptop can be traced back to the PowerBook. These early laptops came with a trackball, floppy drive and palm rests, and the line continued until 2006. The Sony Vaio in 2004 and the IBM ThinkPad T43 in 2005 are natural successors.
What next for PCs? There was of course the invention of the PDA, then the mobile phone, the smartphone and the tablet, all of which could arguably be called a “PC.” The main continuing trend in modern, traditional PCs seems to be more power, more processing capability, better graphics. In many ways, however, progress has stalled. In terms of design, what you view as a PC today likely looks quite similar to what you had a decade ago: maybe smoother, the performance faster, the graphics more intense. What happens next is difficult to predict and open to conjecture. Or perhaps the PC is growing irrelevant in our increasingly mobile world.
Photo Credit: PC Byte , Amber Case, Luc Legay
It would be difficult to find an industry more open to change than manufacturing, a traditional early adopter of new technology and innovative methods. In fact, staying up-to-date with changing trends and movements is a requirement for a manufacturer to remain competitive, because if one manufacturer does not adopt the latest efficient technology, a competitor will, and the laggard will soon find itself in a desperate struggle to catch up - at which point it is too late. You would be hard-pressed to find much within these industries that has not changed, and the rate of change only seems to be increasing. We are now moving into the fourth industrial revolution.
Where once a worker was required for every single task, and a driver for every truck, car and forklift, there now exists a robot capable of doing the job as well as or better than any human counterpart. And if there is not yet an automated system in place, there soon will be. Even something as simple as a foreman with a clipboard and a checklist may soon find themselves using augmented reality to do the same job faster and more accurately.
Even the type of worker required, the skills and expertise needed are changing. Conjure up an image of a warehouse labourer and you’ll likely get a very different image to the kind of employee who is in demand now. Where once a worker was sought after for their strength and ability to work long hours now stands a button-down shirt kind of guy.
The largest employment growth opportunities in manufacturing are for STEM-educated people: people trained in programming, engineering and critical thinking, who will be able to build, maintain and improve the robotic systems being put into place.
One trend that has reversed, and is in fact mostly beneficial, is the reshoring of manufacturing. Where it once made economic sense to move mass-production jobs to lower-cost countries like China, the trend has begun to reverse. For one thing, these countries are maturing, finding their own economic and political footing, and their middle classes are growing. With that comes demand for higher wages for the same work, as well as for new types of jobs. Combine this with the increasing efficiency and cost-effectiveness of new technologies, and it is suddenly viable to produce goods in the original country, such as the USA, Australia and much of Europe.
Just-in-time ordering was one of the biggest revolutions of the last century. The core idea was efficiency: producing exactly what was needed, exactly when it was needed, to (ideally) reduce costs and improve production times. A number of new methods now exist or are in development that will greatly enhance this, chief among them predictive analytics: advanced computer programs, with at least partial AI, that better predict demand trends for products in order to better understand when and where new products will be required for production.
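To make the idea concrete, here is a toy sketch (illustrative numbers only, not a real forecasting model, and not from the article) of how even a simple moving average over historical demand can feed production planning:

```python
def moving_average_forecast(demand, window=3):
    """Forecast next-period demand as the mean of the most recent periods."""
    recent = demand[-window:]
    return sum(recent) / len(recent)

# Hypothetical monthly demand history, in units
history = [120, 135, 150, 160, 170, 180]
forecast = moving_average_forecast(history)
print(forecast)  # mean of the last three months
```

Real predictive-analytics systems use far richer models, but the planning question they answer is the same: how much to produce, and when.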
Once, the maintenance and efficiency of machinery were largely governed by checklists and schedules. A recent innovation, the digital twin, has greatly expanded the possibilities. By linking a machine to a “digital twin” and installing sensors that relay real-time data back to a program, you can understand exactly what state the machine is in. The machine will essentially tell you when a part is close to wearing out, or raise an immediate alarm when it fails. You can likewise gather data and analytics to improve the machine's efficiency and cost-effectiveness. This is a powerful application of Internet of Things (IoT) thinking that will continue to improve.
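As a minimal sketch of the idea (the sensor names and wear limits below are hypothetical, not from any real product), a digital twin can compare live readings against the machine's expected operating envelope and raise alerts:

```python
# Hypothetical operating envelope for a machine's digital twin.
WEAR_LIMITS = {"bearing_vibration_mm_s": 4.5, "motor_temp_c": 85.0}

def check_readings(readings):
    """Return a list of alert strings for any reading past its limit."""
    alerts = []
    for sensor, value in readings.items():
        limit = WEAR_LIMITS.get(sensor)
        if limit is not None and value > limit:
            alerts.append(f"{sensor}: {value} exceeds limit {limit}")
    return alerts

# Vibration past its limit would flag a bearing as close to worn out.
print(check_readings({"bearing_vibration_mm_s": 5.1, "motor_temp_c": 70.0}))
```

A production system would stream readings continuously and feed the same data into efficiency analytics, but the core comparison is this simple.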
The world has changed and is continuing to change. Manufacturers will be required to adapt or be left behind. A large part of ensuring your manufacturing capabilities stay up-to-date will be hiring and training the right kind of workforce. Employees capable of implementing and leading these new technologies will be essential to your future success. Looking back at the way things were, it is difficult to imagine how you can keep up when so much has changed in such a short time. But it is essential that you do.
Photo credit: Flickr
Challenges with Microservices
Microservices provide a nice and easy way to separate out individual concerns and capabilities when building a solution. Each can be developed independently, using whatever technology the squad feels is right for delivering that function. But when you start building a complex solution in this model, there are a few disadvantages or issues to deal with.
I'm trying to list a few from my experience below.
Cultural and Organizational change
The foremost is getting the teams organized around microservices. Though each squad meets and communicates well within the team, it is important to ensure effective communication across squads or teams.
Duplication of Efforts
Often, when there is no good communication across teams, I've seen the wheel reinvented, with multiple teams trying to solve the same problem. Soon you might have two or more implementations of the same functionality in use by different squads.
Distributed Systems are inherently Complex
As with traditional distributed systems, the difficulty lies in managing the distributed system while making sure the end-to-end value of the solution is delivered. For that, you need each individual service to be scalable, available and performing at its best. You also have to deal with network latency, fault tolerance, serialization overhead and, of course, request fan-out leading to increased network traffic.
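One common client-side mitigation for network latency and transient faults is a retry wrapper with exponential backoff. A minimal sketch (generic Python, not tied to any particular RPC framework):

```python
import random
import time

def call_with_retries(fn, attempts=3, base_delay=0.1):
    """Call fn(), retrying on failure with exponential backoff and jitter.

    fn is any callable that raises on a transient failure, such as a
    network timeout from an HTTP client.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the failure to the caller
            # Exponential backoff with jitter to avoid thundering herds.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```

In a real system you would retry only on errors known to be transient, and pair this with circuit breakers and request budgets so that fan-out does not amplify an outage.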
Service Discovery, Visualization
The most important issue with this model is that when you are building a complex solution, you are soon sitting on a pile of microservices and the complexities of handling their integration. Service discovery and visualization become important elements to include in such a scenario.
Operational Overhead
Any cloud or SaaS service is economical to both the consumer and the provider only if you can operate it optimally. Microservices involve a lot of infrastructure and tooling and significant operational overhead. Each team, again, does its own DevOps automation, tooling and so on, so there has to be some governance across teams to bring efficiency to tools and operations. Essentially, ensure repeatability and reliable automation: everything must be defined in code, testable and repeatable.
Security
With many microservices, the attack surface increases, and it is important to consider security holistically rather than at the level of a single microservice. Ensuring that only authenticated users are given access to protected resources, and that the identity context is neatly propagated across microservices, is one key consideration. Secure engineering should take care of closing loopholes for man-in-the-middle attacks. Collecting the logs from all microservices in one place is a prerequisite for security intelligence.
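A minimal sketch of one piece of this: propagating the caller's identity context downstream by forwarding the Authorization header on outbound calls (the header names are a common convention, assumed here, not a prescription):

```python
def outbound_headers(incoming_headers):
    """Build headers for a downstream call, carrying the identity context.

    Only the Authorization header is forwarded; other incoming headers
    stay behind so internal details do not leak between services.
    """
    headers = {"Accept": "application/json"}
    token = incoming_headers.get("Authorization")
    if token:
        headers["Authorization"] = token  # e.g. "Bearer <JWT>"
    return headers
```

In practice this is often handled by a gateway or service mesh, with each service validating the token rather than trusting it blindly.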
Debugging & Testing
Ensuring the entire set of microservices works to fulfil an end-to-end scenario requires that all the services, at the right versions, are picked up, and that they support backward and forward compatibility. Service versioning is another important consideration here.
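One widely used convention for the versioning question is semantic versioning, where a consumer can rely on any provider that shares its major version and is at least as new. A small sketch (the policy here is an assumption for illustration, not something the post prescribes):

```python
def parse(version):
    """Split a 'major.minor.patch' string into a tuple of integers."""
    return tuple(int(part) for part in version.split("."))

def is_compatible(provider, consumer_requires):
    """Semantic-versioning style check: same major version, and the
    provider must be at least as new as what the consumer requires."""
    p, c = parse(provider), parse(consumer_requires)
    return p[0] == c[0] and p[1:] >= c[1:]

print(is_compatible("2.4.1", "2.3.0"))  # True: same major, newer minor
print(is_compatible("3.0.0", "2.3.0"))  # False: major version break
```

Checks like this can run at deployment time, refusing to roll out a combination of service versions that cannot fulfil the end-to-end scenario.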
Overall, I think the technical challenges can be overcome relatively easily. The important and difficult thing is to keep communication across these teams intact. My recommendation is to hold a recurring scrum of scrums to move all the squads forward together.
Check out the Crafting the Cloud videos on challenges with microservices by Kyle and Roland for their interesting insights on the topic: https://www.ibm.com/blogs/bluemix/2016/10/challenges-with-microservices-part1/
This checklist will help you install, configure and use NPI 1.2.0 easily.
Please download the checklist here and print it out to track the various tasks required.
NPI_1 2 0_installation_configuration_cheat_sheet.docx
For any issues, contact our integration experts:
Amanda Yap (email@example.com)
Amilia Robini Aruldas (firstname.lastname@example.org)
For more detailed information, see the IBM Knowledge Center here.
I will be at Emerald City Comic Con March 2-5 at the Topatoco table! Come say hi!!!
We have significant and very urgent needs for various projects
So we are really counting on you…
Press Relations / Communications
We also need 1 or 2 volunteers who are good at press communications (writing press releases, etc.).
We also need 1 or 2 people comfortable with Twitter, to keep an eye on what's happening on our account and elsewhere…
Project Organizer
It's a broad role, but I need a project organizer, with a bit of time, for a specific project.
Lawyers: Press Law and Civil Liability
I would gladly talk with practitioners of press law, all the more so if they want to give us a hand. I already have an excellent lawyer, but I would like to go further, so specialists in doctrine and case law would be welcome.
Contact us here, indicating in the subject line the area you are volunteering for…
Thanks in advance!
P.S. If we don't reply right away, please excuse us; don't hesitate to reach out a second time. Time is very tight for me… Thank you – we'll get there!
A Critical Thought on Customization in Game Design Josh Bycer email@example.com
Continuing my talk on customization vs. personalization, we turn to customization. Allowing the player to decide how they play your game is not easy from a design point of view. I talked about how some choices can be considered too good, while others could be too bad. This is where we get into expert-level game design.
For the first part, you can find the link here: https://www.youtube.com/watch?v=dlV09e7Arx0
The post A Critical Thought on Customization in Game Design appeared first on Game Wisdom.
Long story short, this coming week’s DF RSS feed sponsorship was sold, but now it’s open. If you’ve got a cool product or service you want to promote to DF’s discerning audience, and can make a deal quick, get in touch.
If you look at the downloads page for libtcod, you can now see three new files:
As each commit is made to the libtcod repository, the AppVeyor CI process uploads new versions of these files (with the same file names) on each successful build. Use them at your own risk, but they give you access to builds of the latest changes without having to wait for the next official libtcod release.
submitted by /u/rmtew
The OpenID Foundation membership has approved “Financial API – Part 1: Read Only API Security Profile” as an OpenID Implementer’s Draft. An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification.
The specification is available at:
The voting results were:
Approve – 47 votes
Object – 1 vote
Abstain – 2 votes
Total votes: 50 (out of 200 members = 25% > 20% quorum requirement)
— Michael B. Jones – OpenID Foundation Board Secretary
Source: Daily Motion, les-crises, 24/12/2016
Gabriel Galice, president of the International Peace Research Institute in Geneva: “The Americans have a plan to reshape the Middle East, and it is a project of seizing power”
Gabriel Galice: I would like to give three dates. January 2003: George W. Bush decides to reshape the Greater Middle East. The Greater Middle East runs from Morocco to Pakistan. And he attacks Iraq, with the arguments you know, which we now know were bogus. There were no weapons of mass destruction.
December 19, 2006: Time Magazine, which to my knowledge is not a pro-Russian publication, publishes a secret report on Syria being in Bush's crosshairs. 2006!!! That is, five years before the “Arab Springs.” And it explains how and why the United States government will support the opponents and overthrow Assad's government.
Well… let us take note of blatant interference, with real resources… one billion a year allocated to it.
And then the third date: 2011, and the Greater Middle East continues to be carved up. This time it is Libya that falls. The French and the British put themselves on the front line, and in that same period a new program on Syria begins on the United States side, to arm the fighters.
These are things that must nonetheless be recalled, because it is not only Putin who is involved in this affair. Putin arrived afterwards, well afterwards… except that he had a [word unclear; the transcript notes a Swiss expression] since the seventies (“septante”). There you go…
Host: You are telling us that, deep down, the big bad guys are the United States…
Gabriel Galice: I am saying that the Americans have a plan, which was to reshape the Greater Middle East; I have not once heard that term from the commentators in months of coverage of this question. And this project is a project of seizing power and of bringing down all the states that were Baathist, independent, run by torturers, we agree, but with far greater chaos since we took out Saddam, took out Gaddafi, and are trying to take out Bashar.
We indeed have far more deaths, far more refugees, and far more human tragedy. That is the result. So you do have to be consistent with your starting premises. Do you support that project, that design, or do you consider, as I do, that all the interventions have been negative? And when I heard President Hollande boast, in a book written by a French journalist, of having circumvented the European Union embargo that prohibited delivering weapons, I am not sure that was really the best thing we could have done to calm the conflict.
Those are a few elements that need to be put on the table. It is not “only” the Americans. It is the French, it is the British.
See also the Wikipedia page of the Institut international de recherches pour la paix à Genève.
The press review, where we don't even censor Le Monde. Thanks to our contributors.
Figured I'd share these, especially since they use Gimp and cover its concepts for pixel art, for those who never thought about doing tiles.
Setting up Gimp for pixel art
Dithering mask layers
I know this is more general gamedev stuff, but the older I get, the more I find myself drawn to RLs with tiles, or art with systems as in-depth as the traditional ASCII games.
submitted by /u/deadlyhabit
The Interim Fix for Maximo Asset Management 7.5.0.11 Build 003 is now available.
IF003 (TPAE_75011_IFIX.20170201-1300.psi.zip) is cumulative of all prior Interim Fixes for Maximo Asset Management 7.5.0.11.
Here is the location to download this interim fix:
The Interim Fix for Maximo Asset Configuration Manager 7.6.4.0 Build 001 is now available.
IF001 (ACM7640_ifixes.20170215-1241.zip) is the first Interim Fix for Maximo Asset Configuration Manager 7.6.4.0.
Here is the location to download this interim fix:
The Interim Fix for Maximo Asset Configuration Manager 7.6.3.0 Build 003 is now available.
IF003 (ACM7630_ifixes.20170213-1052.zip) is cumulative of all prior Interim Fixes for Maximo Asset Configuration Manager 7.6.3.0.
Here is the location to download this interim fix:
The Interim Fix for Maximo Asset Configuration Manager 7.5.1.1 Build 029 is now available.
IF029 (ACM7511_ifixes.20170213-1414.zip) is cumulative of all prior Interim Fixes for Maximo Asset Configuration Manager 7.5.1.1.
Here is the location to download this interim fix:
David Wondrich, writing for The Daily Beast:
I hate barstools.
OK, let me amend that. I like them well enough at 2:15 on a Tuesday afternoon, when you can pull one up, lay a stack of bills on the bar and let the afternoon pad away on quiet cat feet of jukebox C&W and Crown Royal.
But when 6:30 p.m. rolls around and you’re trying to get a drink and the bar is palisaded with a Trumpian wall of backs; when putting in a simple drink order means you have to stick your head into someone’s side eye-patrolled personal space and yell past their ear; when reaching over the tight-packed shoulders to get your Martini is like playing one of those rigged claw games — then, barstools suck.
Never really thought about it before, but it really does suck trying to get a drink at a bar when all the stools are occupied.
I am using mingw64, and my build command is "g++ src/main.cpp -I%cd%/include"
I know you are supposed to use visual studio but I REALLY want to avoid using visual studio at ALL costs...
The error I am getting is as follows:
In file included from J:\ProgrammingStuff\C++\libtcodHelloWorld/include/libtcod.hpp:34:0, from src/main.cpp:1:
J:\ProgrammingStuff\C++\libtcodHelloWorld/include/console.hpp:1771:27: warning: type attributes ignored after type is already defined [-Wattributes]
 friend class TCODLIBAPI TCODImage;
C:\Users\Dustin\AppData\Local\Temp\ccYRLt70.o:main.cpp:(.text+0x2f): undefined reference to `__imp__ZN11TCODConsole8initRootEiiPKcb15TCOD_renderer_t'
C:\Users\Dustin\AppData\Local\Temp\ccYRLt70.o:main.cpp:(.text+0x38): undefined reference to `__imp__ZN11TCODConsole14isWindowClosedEv'
C:\Users\Dustin\AppData\Local\Temp\ccYRLt70.o:main.cpp:(.text+0x58): undefined reference to `__imp__ZN10TCODSystem13checkForEventEiP10TCOD_key_tP12TCOD_mouse_t'
C:\Users\Dustin\AppData\Local\Temp\ccYRLt70.o:main.cpp:(.text+0x61): undefined reference to `__imp__ZN11TCODConsole4rootE'
C:\Users\Dustin\AppData\Local\Temp\ccYRLt70.o:main.cpp:(.text+0x6e): undefined reference to `__imp__ZN11TCODConsole5clearEv'
C:\Users\Dustin\AppData\Local\Temp\ccYRLt70.o:main.cpp:(.text+0x77): undefined reference to `__imp__ZN11TCODConsole4rootE'
C:\Users\Dustin\AppData\Local\Temp\ccYRLt70.o:main.cpp:(.text+0x9d): undefined reference to `__imp__ZN11TCODConsole7putCharEiii17TCOD_bkgnd_flag_t'
C:\Users\Dustin\AppData\Local\Temp\ccYRLt70.o:main.cpp:(.text+0xa6): undefined reference to `__imp__ZN11TCODConsole5flushEv'
collect2.exe: error: ld returned 1 exit status
Here is a picture containing as much relevant information as possible, including my file structure, error message, and code:
Any help would be most appreciated. :)
submitted by /u/deeredman1991
A handful of readers have inquired as to the whereabouts of Microsoft‘s usual monthly patches for Windows and related software. Microsoft opted to delay releasing any updates until next month, even though there is a zero-day vulnerability in Windows going around. However, Adobe did push out updates this week as per usual to fix critical issues in its Flash Player software.
In a brief statement this week, Microsoft said it “discovered a last minute issue that could impact some customers” that was not resolved in time for Patch Tuesday, which normally falls on the second Tuesday of each month. In an update to that advisory posted on Wednesday, Microsoft said it would deliver February’s batch of patches as part of the next regularly-scheduled Patch Tuesday, which falls on March 14, 2017.
On Feb. 2, the CERT Coordination Center at Carnegie Mellon University warned that an unpatched bug in a core file-sharing component of Windows (SMB) could let attackers crash Windows 8.1 and Windows 10 systems, as well as server equivalents of those platforms. CERT warned that exploit code for the flaw was already available online.
The updates from Adobe fix at least 13 vulnerabilities in versions of Flash Player for Windows, Mac, ChromeOS and Linux systems. Adobe said it is not aware of any exploits in the wild for any of the 13 flaws fixed in this update.
The latest update brings Flash to v. 22.214.171.124. The update is rated “critical” for all OSes except Linux; critical flaws can be exploited to compromise a vulnerable system through no action on the part of the user, aside from perhaps browsing to a malicious or hacked Web site.
Flash has long been a risky program to leave plugged into the browser. If you have Flash installed, you should update, hobble or remove Flash as soon as possible. To see which version of Flash your browser may have installed, check out this page.
The smartest option is probably to ditch the program once and for all and significantly increase the security of your system in the process. An extremely powerful and buggy program that binds itself to the browser, Flash is a favorite target of attackers and malware. For some ideas about how to hobble or do without Flash (as well as slightly less radical solutions) check out A Month Without Adobe Flash Player.
If you choose to keep and update Flash, please do it today. The most recent versions of Flash should be available from the Flash home page. Windows users who browse the Web with anything other than Internet Explorer may need to apply this patch twice, once with IE and again using the alternative browser (e.g., Firefox or Opera).
Chrome and IE should auto-install the latest Flash version on browser restart, though users may need to manually check for updates and/or restart the browser to get it. When in doubt, click the vertical three-dot icon to the right of the URL bar, select “Help,” then “About Chrome”: if there is an update available, Chrome should install it then.
I've been looking for how to use the latest libtcod with Visual Studio 2015, for C++, but at this point I have not found any solution on the internet. Sometimes I feel like the solution is obvious and everyone can figure it out but me.
I just downloaded the latest 1.6.2 win32 msvs build, plus the dependencies, because I felt like they could be useful. As said in the readme, I located the solution that, as I've seen in other libraries, is presumably supposed to generate the .lib file.
But I get two issues (well, a lot more, but they are the same issues for different files/projects):
"C1083 missing SDL.h" - well, I think I'm supposed to install SDL. I found this; am I supposed to configure the dependencies for all the projects in the solution?
"LNK1104 cannot open libtcod.lib" - I don't really understand this error. Why would VS want to open this file, which obviously doesn't exist, given that I think I'm trying to generate it?
Finally, once I'm able to compile this file... what will happen next? Will I finally be able to configure the library in another project in order to use it?
submitted by /u/lppento
Nuxeo’s Eric Barroca on DAM innovation and interoperability
Better interoperability between Digital Asset Management and adjacent systems is one of my favorite topics (see my post on RDF and schema.org for DAM interoperability). I believe that this requires DAM vendors to work on standards or common “best practices” together. That’s why I love talking interoperability with DAM vendor people (and while we’re at it, why not talk about DAM in general?).
Note that I’m not affiliated with any DAM vendor except for my employer, Digital Collections – and I’m only talking for myself on this Web site, not representing my employer.
Eric Barroca is CEO at (open source) ECM and DAM vendor Nuxeo.
Tim: Would you mind telling us a little bit about you and your journey to becoming CEO of Nuxeo?
Eric: I started at Nuxeo in 2001 as VP of Operations. Since then I wore several hats until, in 2008, I was appointed CEO of what is today an international company. I played a key role in the development of the Nuxeo Platform as a digital asset platform.
Your product is called the “Nuxeo Platform”. Is it mostly targeted at large enterprise setups with a focus on custom development?
The Nuxeo platform is designed to be both highly scalable and cost effective, so it can be deployed by large enterprises as well as smaller departments and organizations. However, our platform is primarily targeted at large enterprises that need to manage large amounts of data or very complex assets.
Most of our customers chose Nuxeo because it can scale to meet the needs of today’s most demanding environments while providing an incredibly low total cost of ownership (TCO). One of the Nuxeo Platform’s strengths lies in its ability to rapidly build custom apps at the enterprise level: out-of-the-box connectors to the systems already in place in the organization, developer tools to help with customization and deployment… Also, our powerful REST API is a great way to build applications and integrate with the platform.
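As an illustration of the REST-API-driven integration Eric describes, here is a minimal Python sketch of how a client might address a document by repository path. The base URL and workspace path are assumptions made for the example, not details from the interview; only URL construction is shown, and no request is sent.

```python
# Minimal sketch: building a document URL for a path-addressed REST API.
# The base URL and document path below are illustrative assumptions.

def document_url(base: str, doc_path: str) -> str:
    """Build the REST endpoint for a document addressed by repository path."""
    return f"{base.rstrip('/')}/path/{doc_path.lstrip('/')}"

url = document_url("http://localhost:8080/nuxeo/api/v1",
                   "/default-domain/workspaces/assets")
print(url)
# → http://localhost:8080/nuxeo/api/v1/path/default-domain/workspaces/assets
```

In practice a client would send an authenticated GET to that URL and receive the document's metadata as JSON, which is what makes this style of API convenient for integrating a DAM into other systems.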
Our platform has been designed to be easily integrated into existing environments. We want to help our users get the most out of the technologies they have in place, instead of replacing everything.
NoSQL, Web Components, cloud and APIs – Nuxeo technology seems to be pretty “bleeding edge”. Would you say that innovation is a major part of the Nuxeo “DNA”?
Totally. Nuxeo’s team is very fond of technology and innovation is at the heart of what we do. We’re always looking for newer, more performant tools that will bring new capabilities to our products. However, we prefer to think of our technology as more “leading edge” than “bleeding edge.” When you work with some of the world’s foremost companies, innovation cannot come at the expense of things like quality, performance and availability.
You tested the Nuxeo Platform with more than a billion documents. How large are the DAM systems you already rolled out to customers, can you share some numbers?
We’re currently working with a post-production company with 10 PB (!!) of archive video. This is a great example of how highly performant the platform can be. By the way, all of our benchmarks are also publicly available on benchmarks.nuxeo.com.
And, as you will see from our benchmarks, Nuxeo provides native integration with the leading NoSQL database, MongoDB, as a content and data storage back end. Using the Nuxeo Platform with MongoDB provides our customers the opportunity to build digital asset management applications with big data tools and processes capable of dealing with enormous data volumes at unmatched speeds.
This integration offers the best of digital asset management (versioning, access control, workflow, querying, metadata management, business logic, sharing, auditing, file conversion, etc.) with scalable, highly available storage to build enterprise content-centric applications. Organizations with large content store requirements leverage MongoDB to access features such as replication, zero downtime and multi-master writes. It also works well alongside Elasticsearch for advanced queries.
In your article Welcome to the Age of Deep Content, you’ve made the point that while we’ve become pretty good at sharing traditional files, digital business transformation requires very rich and structured content. What’s your approach towards sharing that structured content and metadata between systems?
This is one of the key tenets of the Nuxeo Platform. Our vision is that a successful digital asset platform should not be a stand-alone solution. Instead, it should be embedded into your existing IT stack, so that you can experience a multiplier effect. Nuxeo’s architecture allows you to maximize productivity by enabling users to work with and collaborate on content and digital assets regardless of where they are stored (e.g. Adobe, Salesforce, Box, Dropbox, OneDrive, Google Drive, etc.). We provide a number of ready-made Nuxeo connectors to instantly integrate existing asset stores.
A modern content management platform must keep pace by allowing for equally fast changes to its data model to fulfill new informational needs. In legacy systems, document types, metadata, properties and attributes must be defined in advance (“schema on write”) prior to adding any content to the repository. Changing or removing an existing attribute can cause the entire repository to be re-indexed from scratch and, even worse, cause existing end-user applications to stop working. The Nuxeo Platform lets you build future-proof applications using a schema-flexible content model that can be faceted to adapt easily to new business requirements.
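To make the contrast with "schema on write" concrete, here is a small hypothetical Python sketch of the schema-flexible idea: two documents in the same store carry different facets and properties, and readers inspect whatever schema each document actually declares. Every type, facet, and property name here is invented for illustration, not taken from Nuxeo's actual content model.

```python
# Hypothetical illustration: documents in one store with different schemas.
# All type, facet, and property names are invented for the example.
photo = {
    "type": "Picture",
    "facets": ["Versionable"],
    "properties": {"picture:width": 4000, "picture:height": 3000},
}
video = {
    "type": "Video",
    "facets": ["Versionable", "HasStoryboard"],  # extra facet, no migration
    "properties": {"video:duration": 5400},
}

def property_names(doc):
    # Readers work with the schema each document actually carries,
    # rather than assuming one fixed, predefined layout.
    return set(doc["properties"])

print(sorted(property_names(photo) | property_names(video)))
# → ['picture:height', 'picture:width', 'video:duration']
```

The point of the sketch: adding the `HasStoryboard` facet to one document changes nothing for the other documents or for the code reading them, which is the property a fixed schema-on-write repository lacks.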
Fundamentally, we believe that existing systems shouldn’t be replaced, rather they should be integrated into a global digital asset platform that can both scale with the needs of your organization and enrich your assets so that they provide real value to your business.
The Nuxeo Platform supports CMIS. Which other (possibly DAM-specific) data exchange standards are supported? Are you looking into Linked Data standards as well?
As we’ve discussed above, digital asset platforms must be adaptable to today’s very demanding business environments and must be able to connect with other asset sources in the enterprise. Given this, Nuxeo has long been a proponent of open standards and was a member of the Technical Committee for the development of both the CMIS 1.0 and 1.1 standards.
More recently, we are again participating as a Technical Committee member in the development of a CMIS4DAM standard, which we believe will bring great value to our DAM customers.
The XMP standard is also still widespread in the industry. Our customers use it to edit portable metadata on the document itself; that metadata is then extracted by our server and synchronized.
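Because XMP packets are embedded in the asset file itself, delimited by `<?xpacket begin…?>` and `<?xpacket end…?>` markers per the XMP specification, extraction can be sketched in a few lines of Python. This is a simplified illustration of the round-tripping idea, not how Nuxeo's server actually implements it; the sample bytes are fabricated for the example.

```python
# Simplified sketch: locating an embedded XMP packet by its wrapper markers.
import re

def extract_xmp(data: bytes):
    """Return the first embedded XMP packet as text, or None if absent."""
    m = re.search(rb"<\?xpacket begin=.*?\?>(.*?)<\?xpacket end=.*?\?>",
                  data, re.DOTALL)
    return m.group(1).decode("utf-8", "replace") if m else None

# Fabricated file contents: binary junk surrounding an XMP packet.
sample = (b"...binary..."
          b'<?xpacket begin="\xef\xbb\xbf" id="W5M0MpCehiHzreSzNTczkc9d"?>'
          b'<x:xmpmeta xmlns:x="adobe:ns:meta/">...</x:xmpmeta>'
          b'<?xpacket end="w"?>...more binary...')
print(extract_xmp(sample))
# → <x:xmpmeta xmlns:x="adobe:ns:meta/">...</x:xmpmeta>
```

A real extractor would also parse the RDF/XML inside the packet, but the wrapper markers are what make the metadata portable across file formats.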
Are you already involved in developing or advancing DAM interoperability standards, or would you be open to it?
Yes, Nuxeo is an active member of the OASIS Technical Committee that is developing the CMIS4DAM standard.
What’s your outlook for DAM innovations and the DAM market in 2017?
The first wave of digital asset solutions drove huge efficiency gains by enabling companies to digitize their assets and processes. And, now that their assets and processes are increasingly digital, companies are able to find new revenue streams and develop new products and services for their customers. Media companies are creating more assets and distributing them across more channels than ever before. Similarly, in other industries, traditional businesses are beginning to look and act more and more like media companies. Quite simply, the digital asset is the currency of the digital era.
As a result, the DAM market is rapidly evolving. Large, legacy vendors provide feature-rich products built on legacy foundations that don’t scale, aren’t cloud-friendly and don’t integrate easily into customers’ existing ecosystems. Smaller, niche vendors serve up point solutions that can address departmental problems, but don’t scale and aren’t designed to be extensible to the needs of the modern enterprise.
Today, most large enterprises have a massive and rapidly growing number of digital assets and they need rich information about these assets in order to effectively find and use them, including descriptors, rights, permissions, asset relationships, etc. But that information is often scattered across numerous legacy systems or buried in manual processes, making these assets effectively unusable. As a result, we believe there is a huge opportunity for market consolidation and for companies like ours to provide a platform that will scale to the needs of the enterprise, won’t make assets unfindable or unusable, and won’t require customers to “rip and replace” existing systems and asset stores.
The future of DAM is unlocking the full value of enterprise digital assets: embedding business processes, along with the assets themselves, in the cloud; exposing them through APIs that connect easily into a fast-evolving ecosystem of services; and putting analytics and prediction on top.
Thanks to Eric Barroca and the team at Nuxeo for taking the time! By the way, I’m collecting articles about Nuxeo on Planet DAM.
Sun, 19 Feb 2017 20:36:00 +0000
Seems like everyone I know is blue and grouchy and angry; can’t say as I
blame them. But it’s time to turn a corner, because the future’s just as long
as ever, and we need joy to face it. Let me see if I can help.
Canada’s first few crocuses are up!
Yes, I did blog about the spring crocuses in
Clearly I need to remediate 2016’s lacklustre performance.
Once again, as I often do, I should echo the question from John Crowley’s awesome
Little, Big (seriously, one of the best
books): “What is Brother North-Wind’s secret?” The answer: “If Winter comes,
Spring can’t be far behind.”
This winter, our discontent has been political mostly. Lots of wars and
lies and pain to be sad about, but most sharply felt: 62,985,106
Americans, about 25.4% of the potential electorate, thought it was
OK to vote for That Man.
I’m sad too. And about Syria and Brexit and our sick elderly cat and my
children’s foibles and global warming and destructive inequality and the fact
that people still in 2017 think God wants them to kill other people.
But enough of that; today we’re in this blog’s silver-lining department. So
here are a few more things to smile about.
What with the Women’s March and so on, the angry and disappointed have
learned that they’re not crazy and not alone.
The explosion of unrest and anger has educated people around the world
as to how non-monolithic America is.
The proportion of people around the world who’ve realized that
Elections Have Consequences is noticeably higher than a few months ago.
Often I hear good new music on the car radio while I’m driving around.
For example, I recommend
There’s good old music too! The Rolling Stones made a
pure blues record and it’s not
There are a lot of good books being written. For example, I recommend
Do Not Say We Have Nothing.
There’s a lot of really good stuff on TV. For example, I recommend
Look around you; there are good people in the world.
Well, and another crocus.
I’ll watch the forecast for sunshine once they’ve
opened, and take some more.
Seriously, let’s grant that there are really unhappy trends stinking up
the landscape. And that if we want to be part of the solution, it’s going to
be a lot of uphill work with, doubtless, downhill slips. But it’s worth
doing, and for reasons of mental health, and long-term survival, and pure
propaganda, I’m going to try to walk into 2017 with a smile.
A Talk on Steam Direct with Positech Games, by Josh Bycer
This week on the cast, Cliff Harris of Positech Games returned to talk about his upcoming game Production Line and his thoughts on Steam Direct’s changes to the Steam marketplace.
After a brief catch up, we jumped right into talking about Steam Direct. Cliff shared his thoughts on why he’s a supporter of the fee and the problem with the perception of being on Steam these days. We discussed the challenges of being an Indie developer and some of the common issues we see.
After that, we talked about the work he’s doing on his upcoming game Production Line. I asked him about his inspiration and plans for the coming months of work on it.
The post A Talk on Steam Direct with Positech Games appeared first on Game Wisdom.
Modifies queue names in the Java Message Service (JMS) configuration of the application server in which MDM is installed.
When to use:
When MDM is configured to use WebSphere MQ (WMQ), certain default queue names are used. To modify this configuration to use custom queue names, use the post-configuration target modify_default_queues.
The target reads the below properties from file
Uninstalls configuration related to WebSphere Embedded Messaging (WEM) and configures the application server to use WebSphere MQ.
When to use:
When MDM has been installed using WebSphere Embedded Messaging (WEM) and must be reconfigured to use WebSphere MQ (WMQ), use the target switch_to_mq.
The target reads the below properties from file
The Cinémathèque Française organizes a retrospective of a woman director, the better to disparage her.
Did some weightlifting mid-run yesterday at Tennessee Valley Beach. #RESIST #nofilter #rock #rocks #river #beach #message #tenneseevalley #tennesseevalleybeach #yestergram #heycreatedaily #imadethis