Human Generated Data

Title

Untitled (man and woman posed sitting on couch with baby girl)

Date

1950-1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9475

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.2
Human 99.2
Clothing 98.5
Apparel 98.5
Person 97.7
Person 93.8
Footwear 91.9
Shoe 83.3
Table Lamp 80
Lamp 80
Skateboard 78.1
Sport 78.1
Sports 78.1
People 64.4
Shoe 64.1
Furniture 62.5
Shoe 54.6

Clarifai
created on 2023-10-26

people 99.9
child 98
group 97.6
monochrome 97.6
two 96.7
man 96
woman 95.9
three 95.8
adult 95.8
family 92.2
wear 92.1
actress 91
offspring 90
four 89.4
sibling 89.2
group together 88.1
movie 88
son 87.6
actor 86.9
music 85.4

Imagga
created on 2022-01-23

teacher 41.2
person 35.8
educator 30.7
man 29.5
male 26.2
adult 26.2
people 25.6
professional 25.6
men 18
business 17
businessman 16.8
planner 15.9
newspaper 15.9
sport 14.8
silhouette 14.1
group 13.7
old 12.5
product 12.1
room 11.8
black 11.4
couple 11.3
creation 11.3
office 11.2
portrait 11
team 10.7
human 10.5
one 10.4
drawing 9.4
senior 9.4
world 9.2
hand 9.1
dance 9.1
fashion 9
looking 8.8
happy 8.8
women 8.7
casual 8.5
classroom 8.5
finance 8.4
design 8.4
active 8.3
back 8.3
board 8.3
holding 8.2
financial 8
job 8
clothing 7.8
sitting 7.7
crowd 7.7
youth 7.7
outdoor 7.6
art 7.6
sign 7.5
event 7.4
indoor 7.3
girls 7.3
exercise 7.3
success 7.2
lifestyle 7.2
computer 7.2
suit 7.2
work 7.1

Google
created on 2022-01-23

Footwear 98.1
Glasses 97.9
Photograph 94.2
Sunglasses 90.5
Black 89.5
Fashion 88
Eyewear 87.7
Vision care 87.1
Black-and-white 85.7
Goggles 85.4
Style 84
Adaptation 79.3
Monochrome 74.7
Font 74.4
Snapshot 74.3
Monochrome photography 72.3
Event 68.7
Sitting 68.7
Human leg 68.3
Knee 67.6

Microsoft
created on 2022-01-23

person 99.8
text 98.8
outdoor 96.1
clothing 93.6
footwear 86.4
human face 69.4
smile 64.8
black and white 56.1

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 76.7%
Sad 84.8%
Calm 7.5%
Happy 3.2%
Confused 1.7%
Disgusted 1.6%
Fear 0.5%
Angry 0.4%
Surprised 0.4%

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Surprised 77.7%
Happy 13.9%
Fear 3.2%
Calm 2.8%
Angry 0.7%
Disgusted 0.6%
Confused 0.6%
Sad 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Person 97.7%
Person 93.8%
Shoe 83.3%
Shoe 64.1%
Shoe 54.6%
Skateboard 78.1%

Text analysis

Amazon

as
350
:Tя9
VT77A°-XX

Google

3Se
3Se