Human Generated Data

Title

Untitled (two couples posing in living room at Christmas party)

Date

1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9534
Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 98.4
Human 98.4
Apparel 97.8
Clothing 97.8
Person 97.7
Person 96.8
Person 96.8
Person 95
Accessory 91.8
Tie 91.8
Accessories 91.8
Coat 91.5
Suit 91.5
Overcoat 91.5
Tie 91.4
Person 91.3
Gown 73.3
Wedding 73.3
Wedding Gown 73.3
Fashion 73.3
Robe 73.3
Musician 66.6
Musical Instrument 66.6
People 66.6
Face 56.7
Stage 56.3
Crowd 55.7
Shirt 55.2

Imagga
created on 2022-01-28

wind instrument 32.8
people 29
man 28.9
male 28.3
brass 27.3
business 24.9
businessman 22.9
group 22.5
musical instrument 22.3
person 22.3
sax 19.5
adult 17.9
men 16.3
trombone 15.2
couple 14.8
clothing 14.5
silhouette 14.1
black 13.8
office 13.6
work 13.3
suit 12.1
women 11.9
stage 11.5
corporate 11.2
professional 11.1
device 10.9
handsome 10.7
job 10.6
crowd 10.6
room 10.4
musician 10.4
meeting 10.4
music 10.2
cornet 10.1
oboe 10.1
fashion 9.8
family 9.8
executive 9.7
concert 9.7
success 9.6
happy 9.4
lifestyle 9.4
manager 9.3
portrait 9.1
dress 9
team 8.9
together 8.8
standing 8.7
finance 8.4
singer 8.2
style 8.2
building 7.9
guy 7.8
play 7.7
outfit 7.7
modern 7.7
money 7.6
two 7.6
human 7.5
clothes 7.5
fun 7.5
holding 7.4
guitar 7.4
microphone 7.4
entertainment 7.4
looking 7.2
night 7.1
interior 7.1
indoors 7

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 99.7
standing 97.3
clothing 93.5
drawing 92.4
text 92.3
people 86.4
sketch 84.9
group 79.5
posing 73
woman 65.5
dress 64.6

Face analysis

Amazon

Google

AWS Rekognition

Age 40-48
Gender Male, 93.4%
Calm 98%
Happy 0.9%
Surprised 0.5%
Sad 0.2%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Female, 82.8%
Happy 76.5%
Calm 22%
Confused 0.4%
Disgusted 0.3%
Fear 0.3%
Sad 0.2%
Angry 0.2%
Surprised 0.2%

AWS Rekognition

Age 38-46
Gender Male, 100%
Calm 99.7%
Happy 0.2%
Sad 0%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 50-58
Gender Male, 92.9%
Happy 98%
Calm 1.6%
Surprised 0.1%
Sad 0.1%
Confused 0.1%
Angry 0%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%
Tie 91.8%
Suit 91.5%
Wedding Gown 73.3%

Captions

Microsoft

a group of people posing for a photo 98.8%
a group of people posing for the camera 98.7%
a group of people posing for a picture 98.6%

Text analysis

Amazon

10131

Google

-
MJ17--YT37A°2
XAGO
MJ17--YT37A°2 - - XAGO