Human Generated Data

Title

Piccadilly Circus

Date

2003

People

Artist: Paul McCarthy, American, born 1945

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, The Jorie Marshall Waterman '96 and Gwendolyn Dunaway Waterman '92 Fund, 2004.46.5

Copyright

© Paul McCarthy

Machine Generated Data

Tags (confidence scores, in percent)

Amazon
created on 2019-04-05

Person 99.2
Human 99.2
Wood 98.6
Person 97.4
Plywood 96.3
Apparel 95
Footwear 95
Clothing 95
Shoe 95
Furniture 92.6
Couch 92.2
Shoe 91.3
Person 85.2
Flooring 82.7
Indoors 77.4
Interior Design 77.4
Hardwood 74.3
Room 64.6
Figurine 64.6
Overcoat 60
Coat 60
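
The Amazon tags above have the shape of Amazon Rekognition label-detection output. A minimal sketch of how such a list could be reproduced, assuming AWS credentials are configured for boto3; the file name is hypothetical:

# Sketch: label detection with Amazon Rekognition via boto3.
import boto3

client = boto3.client("rekognition")  # credentials/region come from the environment

with open("piccadilly_circus.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,      # roughly the length of the list above
        MinConfidence=60,  # the lowest score shown above is 60
    )

for label in response["Labels"]:
    # Each label carries a name and a confidence score in percent,
    # e.g. "Person 99.2" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')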

Clarifai
created on 2018-04-19

people 97.8
furniture 96.2
room 95
indoors 94.9
man 94.6
family 94.1
adult 91.1
chair 90.2
woman 88.1
child 86.8
seat 85.7
inside 85.3
wood 84.9
house 84.7
business 84.5
sofa 83.5
sit 83.4
home 83.3
group 82.8
table 81.7
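
The Clarifai concepts above correspond to its general prediction model. A hedged sketch against the Clarifai v2 REST API; the API key, model ID, and image URL are all placeholders:

# Sketch: general-model concepts from the Clarifai v2 REST API.
import requests

CLARIFAI_KEY = "YOUR_API_KEY"                   # placeholder credential
GENERAL_MODEL_ID = "general-image-recognition"  # placeholder model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1 floats; scaling by 100 gives scores
    # comparable to "people 97.8" above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')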

Imagga
created on 2018-04-19

executive 51
man 34.3
business 27.3
male 25.5
people 25.1
businessman 23.8
person 22.3
office 22.3
work 19.8
corporate 19.8
happy 18.8
laptop 18.2
suit 17.6
professional 17.6
worker 17.3
sitting 17.2
adult 17.1
table 16.4
meeting 16
group 15.3
men 14.6
smile 14.2
computer 13.6
businesswoman 13.6
home 12.8
outdoors 12.7
team 12.5
job 12.4
handsome 11.6
smiling 11.6
lifestyle 11.6
working 11.5
cheerful 11.4
couple 11.3
manager 11.2
communication 10.9
attractive 10.5
businesspeople 10.4
standing 10.4
desk 10.4
happiness 10.2
successful 10.1
together 9.6
house 9.2
confident 9.1
portrait 9.1
technology 8.9
looking 8.8
notebook 8.6
workplace 8.6
casual 8.5
teacher 8.5
relax 8.4
speaker 8.1
success 8
family 8
indoors 7.9
conference 7.8
seller 7.8
discussion 7.8
colleagues 7.8
pretty 7.7
industry 7.7
adults 7.6
presentation 7.4
board 7.4
black 7.2
women 7.1
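
The Imagga tags above can be fetched from its REST tagging endpoint with HTTP basic auth. A sketch with placeholder credentials and image URL:

# Sketch: tags from the Imagga v2 REST API.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},  # placeholder URL
    auth=("API_KEY", "API_SECRET"),                         # placeholder credentials
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Confidence is already a percentage, matching entries like "executive 51" above.
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')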

Google
created on 2018-04-19

furniture 86.8
table 72.9
design 64.9
interior design 55.4
fun 53.5
chair 50.4
product 50.2
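
The Google tags above line up with Cloud Vision label detection. A minimal sketch using the google-cloud-vision client, assuming application credentials are set in the environment; the file name is hypothetical:

# Sketch: label detection with the Google Cloud Vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("piccadilly_circus.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are 0-1 floats; scaled by 100 they match "furniture 86.8" above.
    print(f"{label.description} {label.score * 100:.1f}")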

Microsoft
created on 2018-04-19

indoor 93.8
person 89.3
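
The Microsoft tags above match the Azure Computer Vision tagging operation. A sketch against the v2.0 REST endpoint current around the creation date; endpoint, key, and image URL are placeholders:

# Sketch: image tagging via the Azure Computer Vision REST API.
import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},  # placeholder URL
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is a 0-1 float; scaled by 100 it matches "indoor 93.8" above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')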

Face analysis

AWS Rekognition

Age 35-55
Gender Male, 97.1%
Calm 14%
Confused 8.2%
Angry 14.3%
Sad 50%
Surprised 3.4%
Disgusted 7.6%
Happy 2.5%

AWS Rekognition

Age 35-52
Gender Male, 52.8%
Happy 48%
Sad 45.8%
Calm 49.5%
Angry 45.5%
Disgusted 45.6%
Surprised 45.4%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Male, 55%
Happy 45%
Confused 45.1%
Disgusted 45.1%
Surprised 45.1%
Calm 51.5%
Angry 48%
Sad 45.2%
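
The three per-face readings above (age range, gender, emotion scores) have the shape of Amazon Rekognition face detection with all attributes requested. A sketch, again assuming configured boto3 credentials and a hypothetical file name:

# Sketch: face attributes with Amazon Rekognition via boto3.
import boto3

client = boto3.client("rekognition")

with open("piccadilly_circus.jpg", "rb") as f:  # hypothetical local copy
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase, e.g. "SAD", with a percent confidence.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')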

Microsoft Cognitive Services

Age 50
Gender Male
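
The single age/gender estimate above matches the Azure Face API detect operation. A sketch against the v1.0 REST endpoint; endpoint, key, and image URL are placeholders:

# Sketch: age/gender estimates via the Azure Face API.
import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},  # placeholder URL
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}, Gender {attrs["gender"].capitalize()}')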

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
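
The ratings above reflect how Google Cloud Vision reports face attributes: as a bucketed likelihood scale (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the same hypothetical file name as before:

# Sketch: face-attribute likelihoods with the Google Cloud Vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("piccadilly_circus.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    print("Joy", face.joy_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Surprise", face.surprise_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)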

Feature analysis

Amazon

Person 99.2%
Shoe 95%
Couch 92.2%

Captions

Microsoft

a person standing in front of a cake 65.2%
a man and a woman standing in front of a cake 48.6%
a person standing in front of a cake 48.5%
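
The candidate captions above, each with a confidence score, match the Azure Computer Vision describe operation. A sketch against the v2.0 REST endpoint; endpoint, key, and image URL are placeholders:

# Sketch: candidate captions via the Azure Computer Vision REST API.
import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/describe",
    params={"maxCandidates": "3"},  # ask for several candidate captions
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},  # placeholder URL
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    # Confidence is a 0-1 float; scaled by 100 it matches "65.2%" above.
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')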