Human Generated Data

Title

Untitled (family with three small children and baby sitting on living room couch)

Date

1937

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9121

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.1
Human 99.1
Person 98.6
Person 98.3
Person 94.5
Person 91.6
Clothing 90.6
Apparel 90.6
Poster 79.1
Advertisement 79.1
Person 77.2
Art 77
People 74.6
Face 65.4
Portrait 62.4
Photography 62.4
Photo 62.4
Clinic 57
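The Amazon tag list above has the shape of an AWS Rekognition DetectLabels response flattened into "Name Confidence" lines. A minimal sketch of that flattening, using a hand-made sample dict in the DetectLabels response shape rather than a live API call (no credentials or real image are assumed):

```python
# Flatten a Rekognition DetectLabels-style response into "Name Confidence"
# lines like the tag listing above. sample_response is a hand-made example,
# not actual API output for this photograph.
sample_response = {
    "Labels": [
        {"Name": "Poster", "Confidence": 79.1},
        {"Name": "Person", "Confidence": 99.1},
        {"Name": "Clothing", "Confidence": 90.6},
    ]
}

def flatten_labels(response):
    """Return tag lines sorted by descending confidence, one decimal place."""
    labels = sorted(response["Labels"],
                    key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {round(l['Confidence'], 1)}" for l in labels]

for line in flatten_labels(sample_response):
    print(line)
```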

Imagga
created on 2022-01-23

barbershop 36.5
shop 29.6
person 27.2
man 26.9
people 23.4
mercantile establishment 22.7
male 19.8
athlete 15.7
place of business 15.2
room 15
player 14.8
kin 14.7
adult 14.6
world 14.4
sport 14.1
men 13.7
happiness 13.3
black 13.2
portrait 12.3
happy 11.9
family 11.6
lifestyle 11.6
statue 11.4
ballplayer 10.6
fun 10.5
love 10.3
nurse 10.1
smiling 10.1
art 9.9
sculpture 9.7
couple 9.6
home 9.6
smile 9.3
business 9.1
fashion 9
human 9
teacher 9
one 9
cheerful 8.9
sexy 8.8
businessman 8.8
motion 8.6
silhouette 8.3
holding 8.2
exercise 8.2
style 8.2
body 8
interior 8
urban 7.9
marble 7.8
play 7.7
mother 7.7
old 7.7
youth 7.7
window 7.7
establishment 7.6
enjoyment 7.5
city 7.5
contestant 7.5
professional 7.4
freedom 7.3
new 7.3
women 7.1
face 7.1
architecture 7
modern 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.6
clothing 90.3
person 86.7
old 84.3
drawing 76
man 70.1
human face 63.9
sketch 57.6
posing 43.3
image 43.2

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 93.1%
Calm 54.3%
Sad 39.7%
Confused 1.3%
Angry 1.2%
Surprised 1.2%
Fear 1.1%
Disgusted 0.6%
Happy 0.6%

AWS Rekognition

Age 24-34
Gender Male, 99.5%
Happy 51.3%
Calm 38.8%
Surprised 3.1%
Disgusted 2.2%
Angry 1.4%
Fear 1.2%
Confused 1%
Sad 1%

AWS Rekognition

Age 23-31
Gender Female, 82.3%
Calm 99.6%
Sad 0.2%
Surprised 0.1%
Confused 0%
Disgusted 0%
Fear 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 6-12
Gender Female, 70%
Happy 57.8%
Calm 39%
Surprised 1.9%
Angry 0.4%
Disgusted 0.4%
Sad 0.3%
Fear 0.2%
Confused 0.1%

AWS Rekognition

Age 24-34
Gender Female, 99.3%
Calm 88.1%
Happy 10.4%
Sad 0.5%
Surprised 0.3%
Confused 0.2%
Fear 0.2%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 28-38
Gender Male, 80.9%
Happy 87%
Calm 5%
Sad 3.7%
Surprised 3%
Disgusted 0.4%
Confused 0.4%
Fear 0.3%
Angry 0.2%
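Each AWS Rekognition block above corresponds to one face in a DetectFaces response: an age range, a gender value with confidence, and an emotion list sorted by confidence. A sketch of rendering one such face record, again using a hand-made sample dict in the DetectFaces response shape rather than a live call:

```python
# Format a Rekognition DetectFaces-style face record into the age/gender/emotion
# listing used above. sample_face is a hand-made example, not live API output.
sample_face = {
    "AgeRange": {"Low": 31, "High": 41},
    "Gender": {"Value": "Female", "Confidence": 93.1},
    "Emotions": [
        {"Type": "SAD", "Confidence": 39.7},
        {"Type": "CALM", "Confidence": 54.3},
        {"Type": "CONFUSED", "Confidence": 1.3},
    ],
}

def describe_face(face):
    """Return display lines: age range, gender, then emotions by confidence."""
    lines = [f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}"]
    g = face["Gender"]
    lines.append(f"Gender {g['Value']}, {round(g['Confidence'], 1)}%")
    for e in sorted(face["Emotions"],
                    key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{e['Type'].capitalize()} {round(e['Confidence'], 1)}%")
    return lines

print("\n".join(describe_face(sample_face)))
```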

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Poster 79.1%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 83%
a vintage photo of a group of people posing for a picture 82.9%
a group of people posing for a photo 82.8%

Text analysis

Amazon

MJI3
ARDA
MJI3 ЭТАЯТІЙ ARDA
ЭТАЯТІЙ
CHUSE