Human Generated Data

Title

Untitled (throwing rice)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.429.22

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 99.4
Person 99.4
Person 98.4
Person 97.2
Clothing 96.1
Apparel 96.1
Person 95.6
Person 93.9
Tie 84.7
Accessory 84.7
Accessories 84.7
Overcoat 79.6
Coat 79.6
Suit 76.5
Face 75.7
Person 73.3
Leisure Activities 55.7
People 55.6
Person 55.3
Person 52.2
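
The Amazon tags above are object and scene labels from AWS Rekognition, each with a confidence score on a 0-100 scale. Below is a minimal Python sketch of the kind of call that produces such labels, assuming boto3 and locally configured AWS credentials; the filename and the MaxLabels/MinConfidence values are illustrative placeholders, not settings taken from this record.

import boto3

# Sketch only: label detection with AWS Rekognition.
# The filename is hypothetical; MaxLabels/MinConfidence are illustrative.
rekognition = boto3.client("rekognition")

with open("untitled_throwing_rice.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))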

Clarifai
created on 2019-03-25

people 99.9
group 99.6
group together 98.9
adult 97.7
man 96.9
woman 96.5
administration 96.3
several 94.4
five 91.5
four 91.1
leader 90.1
actor 87.2
military 85.9
three 85.8
outfit 85.7
movie 84.5
indoors 84.5
war 81.2
portrait 80.6
many 80.2
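
The Clarifai tags above are concepts from Clarifai's general image-recognition model, again scored 0-100 here. Below is a hedged sketch against Clarifai's v2 REST predict endpoint; the API key and image URL are placeholders, and the model ID shown is the historical public ID of the general model, which may differ for other accounts or newer API versions.

import requests

# Sketch only: predict with Clarifai's public general model via the v2 REST API.
# API key and image URL are placeholders; the model ID may vary by account/version.
API_KEY = "YOUR_CLARIFAI_API_KEY"
GENERAL_MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values on a 0-1 scale; scale to match the figures above.
    print(concept["name"], round(concept["value"] * 100, 1))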

Imagga
created on 2019-03-25

man 33.6
people 27.3
male 25.5
room 24.9
adult 23.8
business 23.7
person 19.7
musical instrument 17.8
businessman 16.8
men 16.3
office 15.7
classroom 15.3
corporate 14.6
businesspeople 14.2
black 13.8
looking 13.6
executive 13.5
city 13.3
wind instrument 13.3
indoors 13.2
sitting 12.9
face 12.8
casual 12.7
old 12.5
building 12.5
urban 12.2
professional 12.2
computer 12.1
fashion 12.1
indoor 11.9
lifestyle 11.6
group 11.3
clothing 11.1
device 10.8
handsome 10.7
shop 10.6
formal 10.5
stringed instrument 10.4
women 10.3
two 10.2
suit 10.1
inside 10.1
businesswoman 10
hand 9.9
modern 9.8
interior 9.7
portrait 9.7
couple 9.6
work 9.5
females 9.5
desk 9.4
happiness 9.4
chair 9.3
sax 9.1
dress 9
color 8.9
happy 8.8
architecture 8.6
attractive 8.4
laptop 8.4
sexy 8
barbershop 8
family 8
jacket 8
job 8
life 7.7
pretty 7.7
career 7.6
meeting 7.5
window 7.5
mature 7.4
washboard 7.4
music 7.3
alone 7.3
confident 7.3
art 7.3
pose 7.2
home 7.2
team 7.2
history 7.2
love 7.1
mercantile establishment 7
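
The Imagga tags above come from Imagga's auto-tagging service. A minimal sketch against the v2 tagging endpoint is shown below, assuming HTTP Basic authentication with an API key/secret pair; the credentials and image URL are placeholders.

import requests

# Sketch only: auto-tagging via the Imagga v2 /tags endpoint.
# Credentials and image URL are placeholders.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))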

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

person 99.7
newspaper 92.5
group 83.8
people 83.1
posing 47.5
old 45.5
clothes 30.7
black and white 2.7
retro 2.4
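
The Microsoft tags above are image tags of the kind returned by Azure's Computer Vision analyze endpoint. The sketch below calls the REST API directly; the endpoint, subscription key, image URL, and API version are placeholder assumptions, and Azure reports tag confidences on a 0-1 scale (scaled to 0-100 here to match the list above).

import requests

# Sketch only: image tagging via the Azure Computer Vision "analyze" REST endpoint.
# Endpoint, key, image URL, and API version are placeholders.
ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))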

Color Analysis

Face analysis

AWS Rekognition

Age 38-59
Gender Male, 88.4%
Sad 1.2%
Confused 1.5%
Angry 1.5%
Surprised 1.9%
Calm 5.2%
Happy 87.3%
Disgusted 1.4%

AWS Rekognition

Age 35-52
Gender Female, 55%
Happy 54.9%
Surprised 45%
Sad 45.1%
Confused 45%
Angry 45%
Disgusted 45%
Calm 45%

AWS Rekognition

Age 29-45
Gender Male, 98.1%
Sad 2.3%
Calm 4.1%
Happy 5.6%
Surprised 10.2%
Angry 23.8%
Disgusted 49.6%
Confused 4.3%

AWS Rekognition

Age 35-52
Gender Male, 53.8%
Confused 45.3%
Sad 47.2%
Angry 45.3%
Happy 47.3%
Calm 49.2%
Disgusted 45.2%
Surprised 45.5%

AWS Rekognition

Age 35-52
Gender Male, 86.7%
Happy 79.5%
Confused 3%
Disgusted 3.6%
Angry 3.6%
Calm 2.2%
Surprised 4.2%
Sad 4%

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Calm 47%
Happy 45.2%
Confused 45.1%
Disgusted 46.7%
Sad 45.1%
Surprised 45.4%
Angry 50.6%

AWS Rekognition

Age 14-25
Gender Male, 62.9%
Calm 5.5%
Disgusted 2.3%
Confused 2.7%
Happy 0.6%
Surprised 1.5%
Angry 41.7%
Sad 45.7%

AWS Rekognition

Age 35-52
Gender Female, 53.4%
Angry 45.3%
Calm 50.5%
Happy 46.1%
Surprised 45.2%
Sad 47.6%
Disgusted 45.2%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Female, 53%
Surprised 45.4%
Angry 45.3%
Sad 49.5%
Confused 45.3%
Calm 46.9%
Happy 47.4%
Disgusted 45.2%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Sad 51.3%
Calm 46.6%
Happy 45.2%
Surprised 45.3%
Angry 45.7%
Disgusted 45.7%
Confused 45.2%

AWS Rekognition

Age 35-52
Gender Female, 50.8%
Disgusted 45%
Calm 45.1%
Sad 45.3%
Confused 45%
Happy 54.3%
Surprised 45.1%
Angry 45.2%

AWS Rekognition

Age 38-57
Gender Male, 50.9%
Happy 45.7%
Sad 46.4%
Surprised 45.1%
Disgusted 45.1%
Angry 45.5%
Calm 52%
Confused 45.2%
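
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and a confidence score per emotion. A minimal boto3 sketch of the corresponding call is below; the filename is a placeholder, and Attributes=["ALL"] is what requests the age, gender, and emotion attributes.

import boto3

# Sketch only: per-face age range, gender, and emotion confidences from AWS Rekognition.
# The filename is hypothetical.
rekognition = boto3.client("rekognition")

with open("untitled_throwing_rice.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")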

Microsoft Cognitive Services

Age 69
Gender Male

Microsoft Cognitive Services

Age 65
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 38
Gender Female
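
The Microsoft Cognitive Services entries above report a single estimated age and gender per face, which matches the output of the Azure Face API detect endpoint at the time these results were generated (2019). The sketch below is a hedged REST example; the endpoint, key, and image URL are placeholders, and the age/gender attributes may no longer be available in current versions of the service.

import requests

# Sketch only: per-face age and gender estimates from the Azure Face API detect endpoint.
# Endpoint, key, and image URL are placeholders; attribute availability may have changed.
ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {round(attrs['age'])}")
    print(f"Gender {attrs['gender'].title()}")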

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
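
The Google Vision entries above give per-face likelihood buckets (Very unlikely through Very likely) for joy, anger, sorrow, surprise, headwear, and blur. A minimal sketch with the google-cloud-vision client library is shown below; the filename is a placeholder, and the code assumes a 2.x+ version of the library with application credentials already configured.

from google.cloud import vision

# Sketch only: per-face likelihoods from the Google Cloud Vision face detection API.
# The filename is hypothetical; assumes google-cloud-vision 2.x+ and configured credentials.
client = vision.ImageAnnotatorClient()

with open("untitled_throwing_rice.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)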

Feature analysis

Amazon

Person 99.4%
Tie 84.7%
Suit 76.5%

Categories

Imagga

paintings art 88.9%
people portraits 9.4%