Human Generated Data

Title

Untitled (five Junior League women posing dramatically in carpeted room)

Date

1940-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10038

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.7
Human 99.7
Person 99.5
Person 99.3
Apparel 99.1
Clothing 99.1
Person 99
Person 98.8
Shorts 92.4
Female 89.6
Woman 73.2
People 71
Sleeve 68.9
Suit 64.1
Overcoat 64.1
Coat 64.1
Girl 62.6
Door 59.6
Long Sleeve 56
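The label/confidence pairs above match the shape of AWS Rekognition's DetectLabels response. As a minimal sketch, the snippet below parses a hand-made stand-in for that response (a live call would need boto3 and AWS credentials, shown only in the comment; the label values are illustrative, not a new run against this photograph):

```python
# Sketch: how tag lines like "Person 99.7" could be produced from AWS
# Rekognition's DetectLabels API. A real call (requires credentials and
# the image bytes) would look like:
#
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_labels(
#       Image={"Bytes": image_bytes}, MinConfidence=55
#   )

# Mocked response in the documented DetectLabels shape (values illustrative).
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.7},
        {"Name": "Apparel", "Confidence": 99.1},
        {"Name": "Long Sleeve", "Confidence": 56.0},
    ]
}

def format_tags(resp):
    """Render each label as a 'Name Confidence' line, as in the record above."""
    return [f"{label['Name']} {label['Confidence']:g}" for label in resp["Labels"]]

for line in format_tags(response):
    print(line)
```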

Imagga
created on 2022-01-28

brass 63.6
wind instrument 50
musical instrument 33.5
people 26.2
cornet 26.1
person 25.1
silhouette 23.2
black 21
adult 20.4
sexy 20.1
fashion 19.6
attractive 18.2
model 17.1
sport 16.7
horn 16.5
man 16.1
body 16
device 15.8
male 15.6
portrait 14.9
posing 14.2
style 14.1
lady 13.8
athlete 13.3
human 12.7
exercise 12.7
dress 12.6
pretty 12.6
sunset 12.6
dark 12.5
dancer 11.6
lifestyle 11.6
group 11.3
hair 11.1
women 11.1
active 10.9
pose 10.9
dance 10.3
men 10.3
cute 10
gorgeous 10
player 9.9
fitness 9.9
studio 9.9
together 9.6
elegant 9.4
instrumentality 9.2
sensual 9.1
design 9
trombone 8.8
glamorous 8.7
casual 8.5
skin 8.5
friendship 8.4
runner 8.4
event 8.3
teenager 8.2
figure 8.1
clothing 8.1
performer 8
run 7.7
summer 7.7
dancing 7.7
crowd 7.7
expression 7.7
grunge 7.7
hand 7.6
fashionable 7.6
happy 7.5
planner 7.5
symbol 7.4
competition 7.3
stylish 7.2
team 7.2
bright 7.1
face 7.1
businessman 7.1

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 97.9
dance 79.7
player 77.5
posing 50.1
female 46.2

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 76.1%
Happy 37.2%
Calm 21.4%
Surprised 20.4%
Sad 13.3%
Fear 2.6%
Disgusted 1.9%
Confused 1.8%
Angry 1.3%

AWS Rekognition

Age 49-57
Gender Male, 94.5%
Happy 89.3%
Sad 7.8%
Confused 0.8%
Calm 0.8%
Surprised 0.6%
Fear 0.2%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 47-53
Gender Male, 99.7%
Happy 52.7%
Sad 43.8%
Calm 1.1%
Angry 0.7%
Confused 0.6%
Disgusted 0.5%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 30-40
Gender Male, 97.6%
Happy 76.8%
Confused 9%
Sad 5.8%
Calm 5.3%
Surprised 1.5%
Disgusted 0.8%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 43-51
Gender Male, 93.2%
Happy 54.3%
Calm 20.1%
Sad 15.1%
Surprised 4.8%
Confused 1.9%
Fear 1.4%
Disgusted 1.3%
Angry 1%
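The per-face age range, gender, and emotion percentages above follow the shape of AWS Rekognition's DetectFaces response when called with `Attributes=["ALL"]`. Below is a hedged sketch that summarizes one mocked `FaceDetails` entry (a live call, shown only in the comment, would require credentials; the values are illustrative):

```python
# Sketch: parsing one face record in the DetectFaces response shape.
# A real call (requires credentials and the image bytes) would look like:
#
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_faces(
#       Image={"Bytes": image_bytes}, Attributes=["ALL"]
#   )

# Mocked FaceDetails entry (values illustrative).
face = {
    "AgeRange": {"Low": 33, "High": 41},
    "Gender": {"Value": "Male", "Confidence": 76.1},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 37.2},
        {"Type": "CALM", "Confidence": 21.4},
        {"Type": "SURPRISED", "Confidence": 20.4},
    ],
}

def summarize_face(f):
    """Report the age range, gender call, and dominant emotion for one face."""
    top = max(f["Emotions"], key=lambda e: e["Confidence"])
    return (
        f"Age {f['AgeRange']['Low']}-{f['AgeRange']['High']}, "
        f"Gender {f['Gender']['Value']} ({f['Gender']['Confidence']:.1f}%), "
        f"dominant emotion {top['Type'].title()} ({top['Confidence']:.1f}%)"
    )

print(summarize_face(face))
```

Note that Rekognition reports a confidence for each emotion independently; the "dominant" emotion here is simply the highest-confidence entry, which is how the leading percentage in each block above reads.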

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people posing for a photo 78.6%
a group of people posing for the camera 78.5%
a group of people posing for a picture 78.4%

Text analysis

Amazon

KAGOK
MINT KAGOK
MINT