Human Generated Data

Title

Untitled (man and woman standing with three smiling children)

Date

1967, printed later

People

Artist: Milton Rogovin, American, 1909–2011

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Philip Greider, 2011.588

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Clothing 99.6
Apparel 99.6
Human 99.6
Person 99.6
Person 99.5
Person 99.5
Person 99.4
Person 99
Shorts 93.4
People 90.4
Female 86.7
Pants 74.4
Indoors 72.5
Furniture 70.5
Skirt 69.8
Woman 69.8
Shelf 64.8
Room 64.1
Family 59
Bookcase 58.8
Child 57.1
Kid 57.1
Girl 57
Denim 56.1
Jeans 56.1
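
Tag lists like the one above are usually consumed by thresholding on the confidence score. A minimal sketch in Python, using a few entries transcribed from the Amazon list (the 90.0 cutoff is an arbitrary choice for illustration, not part of the record):

```python
# Filter machine-generated tags by a minimum confidence score.
# Tag data is transcribed from the Amazon Rekognition list above;
# the 90.0 threshold is an arbitrary illustrative choice.

def filter_tags(tags, min_confidence=90.0):
    """Return only the (label, confidence) pairs at or above the threshold."""
    return [(label, score) for label, score in tags if score >= min_confidence]

amazon_tags = [
    ("Clothing", 99.6), ("Apparel", 99.6), ("Human", 99.6),
    ("Person", 99.6), ("Shorts", 93.4), ("People", 90.4),
    ("Female", 86.7), ("Pants", 74.4),
]

high_confidence = filter_tags(amazon_tags)
# keeps the six entries scoring 90.0 or above
```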

Clarifai
created on 2018-03-23

people 100
group 99.6
adult 99.1
child 99
man 98
wear 97.9
woman 97
group together 96.8
two 95.5
portrait 94.4
recreation 93.3
several 92.2
sit 91.4
boy 90.8
monochrome 90.3
administration 89.9
one 89.4
many 89.1
three 88.7
music 88.1

Imagga
created on 2018-03-23

kin 36.3
man 25
person 24.6
people 24
male 23.9
room 20.8
adult 19.9
classroom 18.8
black 17.6
portrait 16.2
silhouette 14.1
happy 13.8
party 12.9
couple 12.2
boy 12.2
sport 11.6
posing 11.5
business 11.5
businessman 11.5
men 11.2
dance 11
dark 10.8
style 10.4
grunge 10.2
smiling 10.1
teen 10.1
art 10
dress 9.9
human 9.7
casual 9.3
teenager 9.1
night 8.9
sexy 8.8
performer 8.8
standing 8.7
dancing 8.7
move 8.6
youth 8.5
dancer 8.5
pretty 8.4
clothing 8.4
active 8.2
child 8.1
new 8.1
team 8.1
success 8
family 8
cool 8
celebration 8
women 7.9
love 7.9
face 7.8
hip 7.8
play 7.7
singer 7.7
modern 7.7
attractive 7.7
expression 7.7
vintage 7.6
head 7.6
fashion 7.5
player 7.5
world 7.4
holding 7.4
action 7.4
musician 7.3
lady 7.3
group 7.2
lifestyle 7.2
music 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 98.2
posing 65.8
group 62.4

Color Analysis

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 98.5%
Sad 33.6%
Angry 8.8%
Disgusted 5.1%
Surprised 4.5%
Calm 28.6%
Happy 3.2%
Confused 16.2%

AWS Rekognition

Age 23-38
Gender Male, 84.5%
Disgusted 3.2%
Angry 2.7%
Calm 0.6%
Sad 3.4%
Happy 83.9%
Confused 3%
Surprised 3.2%

AWS Rekognition

Age 10-15
Gender Female, 50.2%
Happy 6.3%
Disgusted 3%
Surprised 2.2%
Calm 3.5%
Angry 4.5%
Sad 77.7%
Confused 2.8%

AWS Rekognition

Age 23-38
Gender Female, 85.8%
Angry 3%
Sad 56.6%
Calm 0.9%
Disgusted 3.3%
Happy 31.3%
Surprised 2.5%
Confused 2.5%

AWS Rekognition

Age 4-9
Gender Female, 70.8%
Sad 81.5%
Surprised 0.8%
Confused 0.9%
Happy 13.5%
Calm 0.8%
Disgusted 1%
Angry 1.5%
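
Each Rekognition face record above spreads its confidence across seven emotions; a common post-processing step is to take the highest-scoring one. A small sketch using the scores transcribed from the second face record (the helper name is ours, not part of the Rekognition API):

```python
# Pick the dominant emotion from a Rekognition-style emotion breakdown.
# Scores are transcribed from the second face record above.

def dominant_emotion(emotions):
    """Return the (name, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

face_2 = {
    "Disgusted": 3.2, "Angry": 2.7, "Calm": 0.6, "Sad": 3.4,
    "Happy": 83.9, "Confused": 3.0, "Surprised": 3.2,
}

name, score = dominant_emotion(face_2)
# name == "Happy", score == 83.9
```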

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 68
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely
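
Unlike the numeric scores above, Google Vision reports face attributes as likelihood buckets, so comparing faces means mapping buckets onto an ordinal scale. A sketch, assuming the standard five-bucket ordering of the Vision API's Likelihood enum (the helper name is ours):

```python
# Map Google Vision likelihood buckets to ordinal ranks for comparison.
# Bucket order follows the Vision API Likelihood enum; UNKNOWN is omitted.
LIKELIHOOD_ORDER = [
    "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def likelihood_rank(bucket):
    """Return an integer rank, 0 (Very unlikely) through 4 (Very likely)."""
    return LIKELIHOOD_ORDER.index(bucket)

# Joy values reported for the five detected faces above:
joy = ["Unlikely", "Likely", "Very likely", "Likely", "Very likely"]
ranks = [likelihood_rank(b) for b in joy]
# ranks == [1, 3, 4, 3, 4]
```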

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

people portraits 99.3%
paintings art 0.6%