• 0

w3c validation insanity


Question

OK, so I've been working on this site for a university project, and I've come to validate the XHTML and found this: http://validator.w3.org/check?uri=http%3A%2F%2Fjusebox.tk%2Fbrowse.php&charset=%28detect+automatically%29&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.2

Most (if not all) of these errors are caused either by characters in URLs that appear in my JavaScript, or by characters in the RSS feeds I'm echoing out via PHP. I have absolutely no idea how to tackle this and get the page to validate. Can anyone help me out?

the page is at http://jusebox.tk/browse.php

Cheers :)


12 answers to this question

Recommended Posts

  • 0

I'd recommend one of two things:

1. Put your JavaScript in a separate .js file and load it (see the one-line include sketched after the example below). This gets around all the problems of the kind you're having.

or

2. Use the CDATA feature, e.g.

<script type="text/javascript">

<![CDATA[

// Your javascript code

]]></script>
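For option 1, the include is just one line in the page; the filename is up to you, browse.js here is only an example:

<!-- the raw & characters now live in browse.js, so the XHTML itself stays clean -->
<script type="text/javascript" src="browse.js"></script>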

  • 0

This is how my JS works:

<script type="text/javascript">
	var last_url;

	$(document).ready(function(){
		manage_goto();
	});

	// Save the chosen URL (from the text box or a feed link) and update the page
	function manage_goto(){
		$('#go_to').click(function(){
			last_url = $('#url').val();
			load_page();
		});
		$('.all_feeds a').click(function(){
			last_url = $(this).attr('href');
			load_page();
		});
	}

	function load_page(){
		update_fb_link();
	}

	// Rebuild the Facebook share URL so it points at the last loaded page
	function update_fb_link(){
		var share_url = 'http://www.facebook.com/sharer.php?t=Sharing&u=' + last_url + '&title=via+Jusebox';
		$('#fb_link').attr('href', share_url);
	}
</script>

I've tried changing the ampersands, but as I said, it renders the links useless.

Also, I don't actually know how to write PHP lol. A friend helped me out with it, but I can't get through to him at the moment. I read those links you sent me, and although I understand how it works, I have no idea how to implement it. This is my code:

<?php
	// Pull each <item> out of the E! Online RSS feed
	$doc = new DOMDocument();
	$doc->load("http://feeds.eonline.com/eonline/uk/topstories?format=xml");
	$arrFeeds = array();
	foreach ($doc->getElementsByTagName('item') as $node) {
		$itemRSS = array(
			'title' => $node->getElementsByTagName('title')->item(0)->nodeValue,
			'link' => $node->getElementsByTagName('link')->item(0)->nodeValue,
			'description' => strip_tags($node->getElementsByTagName('description')->item(0)->nodeValue),
		);
		array_push($arrFeeds, $itemRSS);
	}

	// Echo the feed entries into the page (truncate only the description text, not the tag)
	foreach ($arrFeeds as $value) {
		echo "<a class='rss_link' target='frame' href='".$value['link']."'>".$value['title']."</a>";
		echo "<p class='rss_description'>".substr($value['description'], 0, 200)."...</p>";
	}
?>

Any idea how I should go about this? Anything anyone can come up with is really appreciated! My deadline is nearing lol :s

  On 27/05/2011 at 20:31, DonC said:

I'd recommend one of two things:

1. Put your javascript in a separate .js file and load it. This gets around all possible problems of the kind you're having.

or

2. Use the CDATA feature, e.g.

<script type="text/javascript">

<![CDATA[

// Your javascript code

]]></script>

That's a really good idea, having the JS in a separate file. I'll try that now. Thanks :)

  • 0

Really? I just created separate .js files for the offending scripts and it works like a charm.

Just out of interest (I'm learning, lol), would it be better practice to use the CDATA tags, and if so, why?

Also, is there anything you could suggest for my PHP problem? Thanks :blush:

  • 0

What I meant to say is: when you print the links, use the htmlspecialchars function to encode special characters like & (as &amp;) so that the markup validates with the W3C. When you use that data in JavaScript, do a global replace of &amp; with & and then proceed. That way the scripts will work and the page will validate.
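A rough sketch against the loop you posted (same variable names, untested, just to show the idea):

foreach ($arrFeeds as $value) {
	// htmlspecialchars turns & into &amp; (and escapes quotes), so the echoed
	// href is valid XHTML; the browser decodes it back to & when the link is used
	$safe_link  = htmlspecialchars($value['link'], ENT_QUOTES);
	$safe_title = htmlspecialchars($value['title'], ENT_QUOTES);
	echo "<a class='rss_link' target='frame' href='" . $safe_link . "'>" . $safe_title . "</a>";
}

On the JavaScript side, the global replace is just last_url = last_url.replace(/&amp;/g, '&'); before you build the Facebook share URL.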

This topic is now closed to further replies.