Human Generated Data

Title

Piccadilly Circus

Date

2003

People

Artist: Paul McCarthy, American, born 1945

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, The Jorie Marshall Waterman '96 and Gwendolyn Dunaway Waterman '92 Fund, 2004.46.9

Copyright

© Paul McCarthy

Machine Generated Data

Tags

Amazon
created on 2022-05-28

Human 98.9
Person 98.9
Wood 95.6
Home Decor 95.5
Person 93.3
Plywood 93.2
Clothing 91.1
Apparel 91.1
Flooring 81.5
Indoors 76.5
Workshop 75.9
Building 74.4
Urban 70.9
Housing 70.1
Shoe 63.6
Footwear 63.6
Furniture 62.8
Room 60
Town 58.9
Road 58.9
City 58.9
Street 58.9
Floor 58.1
Couch 56.3
Pants 55.3
Shoe 55

Imagga
created on 2022-05-28

man 29.6
person 23.4
people 22.3
happy 20
adult 19.3
lifestyle 17.3
male 16.7
smiling 16.6
indoors 15.8
cheerful 13.8
sitting 13.7
portrait 12.9
home 12.7
work 12.5
working 12.4
smile 12.1
men 12
women 11.9
room 11.8
fun 11.2
looking 11.2
black 10.9
child 10.9
guy 10.7
business 10.3
clothing 10.3
professional 9.9
family 9.8
job 9.7
businessman 9.7
boy 9.6
happiness 9.4
youth 9.4
vehicle 9.3
worker 9.1
attractive 9.1
team 9
group 8.9
leisure activity 8.8
together 8.8
repair 8.6
holding 8.2
occupation 8.2
table 8
kid 8
to 8
shop 7.9
color 7.8
pretty 7.7
jeans 7.6
two 7.6
active 7.6
student 7.5
house 7.5
human 7.5
leisure 7.5
outdoors 7.5
bedroom 7.4
recreation 7.2
interior 7.1
day 7.1

Google
created on 2022-05-28

Flash photography 87.5
Building 86.6
Flooring 82.5
Floor 81.9
Art 75.4
Space 74.5
Darkness 73.8
Event 73.4
Entertainment 70.7
Performing arts 70
Room 67.6
Fashion design 65.7
Visual arts 63.6
Performance art 63
City 62.1
Night 60.2
Artist 59.8
Winter 59.7
Street 56.5
Music venue 55.1

Microsoft
created on 2022-05-28

snow 96.8
clothing 90.8
drawing 89.7
person 86.6
painting 83.1
cartoon 83
snowboarding 77
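The tag lists above are label–confidence pairs reported by each service. As an illustration only (a hypothetical helper, not part of any of the services' APIs), such a list can be filtered at a confidence threshold; the sample data below is copied from the Amazon tags above:

```python
# Hypothetical helper: keep (label, confidence) pairs at or above a
# threshold, sorted highest-confidence first.
def filter_tags(tags, threshold=90.0):
    """Return tags whose confidence meets the threshold, highest first."""
    kept = [(label, conf) for label, conf in tags if conf >= threshold]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

# A few of the Amazon-generated tags listed above.
amazon_tags = [
    ("Human", 98.9), ("Person", 98.9), ("Wood", 95.6),
    ("Home Decor", 95.5), ("Flooring", 81.5), ("Indoors", 76.5),
]

high_confidence = filter_tags(amazon_tags)
# Only the four tags at 90.0 or above survive the default threshold.
```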

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 54.5%
Happy 35.2%
Angry 31.7%
Sad 15.8%
Surprised 9.3%
Fear 7.2%
Calm 4.3%
Disgusted 3.9%
Confused 1.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
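Unlike the percentage scores elsewhere in this record, Google Vision reports face attributes as likelihood buckets. A minimal sketch of comparing those buckets, assuming the ordinal scale implied by the labels shown above (the ordering and helper are illustrative, not a Google API):

```python
# Assumed ordinal scale for the likelihood labels used in this record,
# from least to most likely.
LIKELIHOOD_ORDER = [
    "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def at_least(likelihood, floor):
    """True if `likelihood` sits at or above `floor` on the ordinal scale."""
    return LIKELIHOOD_ORDER.index(likelihood) >= LIKELIHOOD_ORDER.index(floor)
```

Under this scale, every attribute reported above ("Very unlikely") falls below any "Possible" floor, so none would be flagged.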

Feature analysis

Amazon

Person 98.9%
Shoe 63.6%

Captions

Microsoft

a man riding a snowboard down the side of a building 57.4%
a man that is standing in the snow 57.3%
a group of people that are standing in the snow 57.2%

Text analysis

Amazon

GENHE

Google

MUS
MUS SENHE ARS
ARS
SENHE