Human Generated Data

Title

Untitled (Mao Diptych #2)

Date

2009

People

Artist: Zhang Dali, Chinese, born 1963

Publisher: Pace Editions, Inc., American

Publisher: Ethan Cohen Fine Arts

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Richard Solomon and Ethan Cohen, 2010.480.2

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Human 99.4
Person 99.4
Person 99.4
Person 99.3
Person 98.3
Military 97.4
Person 97.3
Military Uniform 96.7
Officer 95.5
Person 91.1
Architecture 88.8
Building 88.8
Face 88.1
Person 85.3
Grass 79
Plant 79
Clothing 78.2
Apparel 78.2
People 61.9
Footwear 60.8
Shoe 60.8
Shoe 59.7
Captain 58.5
Advertisement 55.3
Poster 55.3
Person 53.7
Person 46.3

Clarifai
created on 2018-03-23

people 99.7
man 96.3
group together 91.5
adult 90.4
group 89.3
child 88.8
administration 88.6
uniform 88.2
leader 84.2
military 83.9
many 77.7
woman 76.5
family 75.3
war 74.9
home 74.8
portrait 74.2
outfit 74.1
wear 73.3
boy 73.1
recreation 69

Imagga
created on 2018-03-23

man 45.7
male 34.1
sport 29.3
grass 27.7
military uniform 27
outdoors 26.9
person 26.7
uniform 23.6
people 23.4
player 22
clothing 20
leisure 19.9
golf 19.1
golfer 18.7
sky 17.9
outdoor 17.6
active 17.4
recreation 17
club 17
ball 17
field 16.7
adult 16.4
play 16.4
businessman 15.9
athlete 15.7
swing 15.7
day 15.7
park 15.6
playing 14.6
business 14.6
musical instrument 14.4
professional 14.4
ballplayer 14.4
men 13.7
course 13.6
game 12.5
covering 12.4
walking 12.3
boy 12.2
wind instrument 12.1
suit 12
outside 12
competition 11.9
consumer goods 11.8
tee 11.7
summer 11.6
happy 11.3
equipment 11.3
success 11.3
fun 10.5
accordion 10.1
contestant 9.9
activity 9.9
golfing 9.8
hit 9.7
standing 9.6
photographer 9.3
sports 9.3
exercise 9.1
environment 9
family 8.9
couple 8.7
executive 8.5
hobby 8.5
action 8.3
world 8.3
danger 8.2
copy space 8.1
weapon 8.1
keyboard instrument 8.1
handsome 8
back 7.9
device 7.8
practice 7.7
corporate 7.7
driving 7.7
hiking 7.7
walk 7.6
relax 7.6
drive 7.6
senior 7.5
megaphone 7.5
manager 7.5
landscape 7.4
holding 7.4
gun 7.3
freedom 7.3
spring 7.1
work 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

grass 99.8
outdoor 95.1
person 89.8
standing 87.6
posing 38.3

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 29-45
Gender Male, 55%
Happy 53.7%
Confused 45.3%
Calm 45.1%
Disgusted 45.3%
Angry 45.4%
Sad 45.1%
Surprised 45.2%

AWS Rekognition

Age 57-77
Gender Male, 50.2%
Disgusted 45.4%
Happy 49.5%
Confused 46.3%
Calm 46.4%
Sad 45.6%
Surprised 45.9%
Angry 45.9%

AWS Rekognition

Age 38-57
Gender Male, 54%
Angry 45.1%
Surprised 45.1%
Confused 45.1%
Disgusted 45%
Happy 54.7%
Sad 45.1%
Calm 45%

AWS Rekognition

Age 26-44
Gender Male, 55%
Calm 48.6%
Sad 45.4%
Confused 45.4%
Disgusted 45.1%
Happy 49.5%
Surprised 45.5%
Angry 45.5%

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 50
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Shoe 60.8%

Text analysis

Amazon

histhan?