Human Generated Data

Title

Untitled (women doing crafts at dining room table, Christmas table cloth)

Date

1960

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18736

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-03-05

Person 99.4
Human 99.4
Person 99.4
Person 99.1
Person 99
Person 99
Person 98.6
People 82.7
Game 64.2
Face 59
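
A minimal sketch of how label scores like these can be reproduced with Amazon Rekognition's DetectLabels operation via boto3; the region and the local filename photo.jpg are assumptions, not part of the record:

    import boto3

    # Rekognition client; any supported region works.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=10,
            MinConfidence=50,
        )

    # Each label has a Name and a 0-100 Confidence score,
    # matching entries such as "Person 99.4" above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))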

Imagga
created on 2022-03-05

person 31.3
male 29.1
man 29
people 27.9
blackboard 27.4
businessman 22.9
teacher 20
adult 19.6
education 19
business 18.8
classroom 14.7
human 14.2
work 14.2
job 14.1
student 14
happy 13.8
room 13.3
expression 12.8
black 12.6
silhouette 12.4
boy 12.2
office 12
school 12
portrait 11.6
professional 11.6
hand 11.4
computer 11.4
group 11.3
modern 11.2
looking 11.2
men 11.2
youth 11.1
team 10.7
design 10.7
class 10.6
board 10.1
success 9.7
symbol 9.4
casual 9.3
manager 9.3
laptop 9.3
smile 9.3
sport 9.2
sign 9
idea 8.9
to 8.8
teaching 8.8
desk 8.7
drawing 8.6
college 8.5
film 8.4
indoor 8.2
one 8.2
child 8.1
lady 8.1
home 8
negative 7.9
player 7.8
chalkboard 7.8
diagram 7.7
chart 7.6
gesture 7.6
meeting 7.5
creation 7.5
senior 7.5
suit 7.5
event 7.4
graphic 7.3
pose 7.2
handsome 7.1
love 7.1
interior 7.1
flag 7
book 7
executive 7
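
Imagga's long tag list (including low-signal entries such as "to") is the raw output of its /v2/tags endpoint, which returns English tags with 0-100 confidence scores. A sketch of the call with the requests library; the credentials and image URL are placeholders:

    import requests

    # Placeholder credentials from an Imagga account.
    AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},  # hypothetical URL
        auth=AUTH,
        timeout=30,
    )

    # Tags arrive sorted by confidence, e.g. "person 31.3".
    for entry in response.json()["result"]["tags"]:
        print(entry["tag"]["en"], round(entry["confidence"], 1))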

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 95.6
person 93.9
indoor 91.6
music 84.3
human face 74.7
black and white 53.7
clothing 51.1
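
The Microsoft tags come from Azure Computer Vision's image-analysis API, which reports confidence on a 0-1 scale; the record shows it rescaled to percentages. A sketch using the azure-cognitiveservices-vision-computervision SDK, with endpoint, key, and image URL as placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key for a Computer Vision resource.
    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("AZURE_CV_KEY"),
    )

    analysis = client.analyze_image(
        "https://example.com/photo.jpg",  # hypothetical URL
        visual_features=[VisualFeatureTypes.tags],
    )

    # Confidence is 0-1; scale by 100 to match "text 95.6" above.
    for tag in analysis.tags:
        print(tag.name, round(tag.confidence * 100, 1))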

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 99.4%
Calm 79.9%
Surprised 10%
Sad 3.7%
Happy 2.9%
Angry 1.1%
Confused 1%
Disgusted 0.8%
Fear 0.7%

AWS Rekognition

Age 49-57
Gender Male, 66.7%
Calm 52.6%
Happy 41.4%
Sad 3%
Surprised 0.9%
Angry 0.7%
Confused 0.6%
Disgusted 0.6%
Fear 0.3%

AWS Rekognition

Age 37-45
Gender Female, 72.7%
Calm 97.2%
Happy 1.5%
Surprised 0.4%
Sad 0.2%
Angry 0.2%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Surprised 98.5%
Calm 1.1%
Happy 0.1%
Angry 0.1%
Fear 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0%
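
Each block above is one FaceDetails entry from Amazon Rekognition's DetectFaces operation called with Attributes=["ALL"]: an estimated age range, a gender guess with its confidence, and per-emotion scores that sum to roughly 100%. A sketch of the call, again assuming a hypothetical local photo.jpg:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]    # e.g. {"Low": 41, "High": 49}
        gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 99.4}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotion types come back uppercase (e.g. "CALM"); print highest first.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")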

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
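
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why these blocks read "Very unlikely" and "Unlikely". A sketch with the google-cloud-vision client, assuming application-default credentials and a hypothetical local photo.jpg:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # One annotation per detected face; each attribute is a Likelihood enum.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)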

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people sitting at a table 72.9%
a group of people looking at a laptop 63.1%
a group of people sitting in front of a laptop 59.6%
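
The ranked captions match what Azure Computer Vision's Describe operation returns: a few candidate sentences, each with a 0-1 confidence that the record shows as a percentage. A sketch using the same placeholder Azure client as in the tags example above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("AZURE_CV_KEY"),       # placeholder key
    )

    # Ask for up to three candidate captions for a hypothetical image URL.
    description = client.describe_image(
        "https://example.com/photo.jpg",
        max_candidates=3,
    )

    for caption in description.captions:
        print(caption.text, round(caption.confidence * 100, 1))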