Human Generated Data

Title

Untitled (eleven people posing together in room decorated for Christmas)

Date

1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9529

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Human 99.7
Person 99.7
Person 99.4
Person 99.2
Person 99.2
Clothing 98.9
Apparel 98.9
Person 98.6
Person 98.6
Person 95.6
Person 91.5
Person 90.7
Person 89.1
Furniture 86.7
Face 84.6
Person 81.1
Female 80.6
Dress 77.7
People 77
Shoe 71.1
Footwear 71.1
Suit 67.1
Coat 67.1
Overcoat 67.1
Girl 63.8
Woman 63.6
Fashion 63.4
Robe 63.4
Chair 62
Photo 61.4
Photography 61.4
Stage 59.3
Gown 58.9
Steamer 57.8
Evening Dress 57.6
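
The label/confidence pairs above follow the output format of Amazon Rekognition's DetectLabels operation. The sketch below shows how such tags could be retrieved with boto3; the file name, region, and thresholds are placeholders rather than values taken from this record.

```python
import boto3

# Minimal sketch: reproduce label/confidence pairs like those above with
# Amazon Rekognition's DetectLabels. File name and region are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,         # cap on the number of labels returned
    MinConfidence=50.0,   # drop low-confidence labels
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

The Person, Shoe, and Chair entries under Feature analysis further below appear to come from the same kind of response: those labels also carry bounding-box Instances.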

Imagga
created on 2022-01-28

people 34.5
man 34.2
couple 27
brass 26.6
person 26.5
kin 25.3
male 24.1
wind instrument 23.9
happy 22.5
group 21.7
businessman 21.2
business 20
adult 18.6
musical instrument 18.1
smiling 18.1
portrait 17.4
trombone 17
groom 16.4
men 16.3
family 16
bride 15.6
two 15.2
love 15
together 14.9
happiness 14.9
women 14.2
professional 13.2
team 12.5
friends 12.2
standing 12.2
wedding 11.9
holiday 11.5
golfer 11.4
friendship 11.2
room 11
casual 11
occupation 11
suit 10.8
handsome 10.7
fashion 10.5
human 10.5
fun 10.5
office 10.4
black 10.2
smile 10
player 9.9
summer 9.6
corporate 9.4
bouquet 9.4
active 9.4
day 9.4
lifestyle 9.4
life 9.2
girls 9.1
silhouette 9.1
businesswoman 9.1
contestant 9
sky 8.9
job 8.8
boy 8.7
crowd 8.6
walking 8.5
holding 8.2
park 8.2
outdoors 8.2
dress 8.1
success 8
face 7.8
gymnasium 7.8
cornet 7.7
nurse 7.7
married 7.7
hand 7.6
meeting 7.5
joy 7.5
blackboard 7.4
camera 7.4
world 7.4
worker 7.3
cheerful 7.3
looking 7.2
child 7.1
posing 7.1
working 7.1
work 7.1
sea 7
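
The Imagga tags above match the tag/confidence structure of Imagga's image tagging API. A sketch assuming the public v2 /tags endpoint and HTTP basic authentication; the API key, secret, and file name are placeholders.

```python
import requests

# Sketch only: tag a local image with Imagga's v2 tagging endpoint.
# Credentials and file name are placeholders.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

with open("photo.jpg", "rb") as f:
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```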

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 97.3
dress 97.3
clothing 97.1
text 93.9
standing 92.3
woman 92.3
old 88.9
wedding dress 86.6
bride 75.8
posing 75.8
man 75.4
black 72.2
white 66.9
player 62.1
smile 61.6
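
The Microsoft tags above resemble the output of Azure Computer Vision's image tagging. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: image tags with confidences, similar to the Microsoft list
# above. Endpoint, key, and file name are placeholders.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```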

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 97.7%
Calm 68%
Happy 26.4%
Surprised 4.1%
Sad 0.7%
Confused 0.3%
Disgusted 0.3%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Male, 97.6%
Calm 98.9%
Happy 0.3%
Sad 0.3%
Surprised 0.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 48-56
Gender Male, 80.8%
Happy 52.6%
Surprised 41.6%
Calm 2.7%
Sad 1.1%
Disgusted 0.7%
Confused 0.7%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 40-48
Gender Male, 100%
Surprised 55.9%
Calm 30.7%
Sad 4%
Confused 2.8%
Disgusted 2.6%
Happy 1.8%
Fear 1.2%
Angry 1.1%

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Sad 87.2%
Happy 9.5%
Surprised 1.9%
Calm 0.4%
Fear 0.3%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 37-45
Gender Male, 74.5%
Calm 83.1%
Happy 6.9%
Angry 3%
Confused 2%
Surprised 1.8%
Sad 1.4%
Disgusted 1.3%
Fear 0.5%

AWS Rekognition

Age 42-50
Gender Male, 93.5%
Happy 76.4%
Sad 11.9%
Confused 5.6%
Calm 2.6%
Disgusted 1.2%
Fear 1.1%
Surprised 0.7%
Angry 0.6%

AWS Rekognition

Age 47-53
Gender Male, 80%
Happy 88.2%
Calm 7.8%
Confused 1.7%
Sad 0.7%
Surprised 0.6%
Disgusted 0.4%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 33-41
Gender Male, 91.4%
Calm 99.1%
Sad 0.3%
Happy 0.3%
Fear 0.1%
Confused 0.1%
Angry 0.1%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Sad 80.8%
Calm 12.8%
Happy 2.7%
Confused 1.5%
Surprised 0.8%
Angry 0.7%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Sad 55.3%
Calm 16.9%
Angry 15%
Confused 4.9%
Surprised 2.9%
Fear 2.3%
Disgusted 1.4%
Happy 1.3%
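
Each AWS Rekognition block above (age range, gender, and emotion percentages) corresponds to one entry in the FaceDetails list returned by the DetectFaces operation when all attributes are requested. A minimal sketch with boto3; the file name and region are placeholders.

```python
import boto3

# Sketch: per-face age range, gender, and emotion scores like the
# AWS Rekognition blocks above. File name and region are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```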

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
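
Each Google Vision block above reports per-face likelihood ratings, as returned by Cloud Vision face detection. A sketch with the google-cloud-vision Python client; the file name is a placeholder, and the client prints enum names such as VERY_UNLIKELY rather than the prose form shown above.

```python
from google.cloud import vision

# Sketch: per-face likelihood ratings (surprise, anger, sorrow, joy,
# headwear, blur) like the Google Vision blocks above.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```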

Feature analysis

Amazon

Person 99.7%
Shoe 71.1%
Chair 62%
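
The Person, Shoe, and Chair percentages above match the corresponding label confidences in the Tags section and appear to correspond to DetectLabels results that carry bounding-box instances. A sketch of reading those instance detections with boto3; placeholders as before.

```python
import boto3

# Sketch: instance-level detections (Person, Shoe, Chair above) from the
# "Instances" field of a DetectLabels response. Placeholders as before.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50.0)

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'(left={box["Left"]:.2f}, top={box["Top"]:.2f})')
```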

Captions

Microsoft

a vintage photo of a group of people posing for the camera 94%
a vintage photo of a group of people posing for a picture 93.9%
a group of people posing for a photo 93.8%
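
The captions above, each with a confidence score, match the output of Azure Computer Vision's image description feature. A sketch using the same SDK as the tagging example; describe_image_in_stream, max_candidates, and the placeholders are assumptions based on the SDK's public interface, not on this record.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: caption candidates with confidences, like the Microsoft
# captions above. Endpoint, key, and file name are placeholders.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```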

Text analysis

Amazon

LIE
10131
KODAKA-ITW
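
The detected strings above (likely Kodak edge printing on the negative) are the kind of output produced by Amazon Rekognition's DetectText operation. A minimal sketch with boto3; the file name and region are placeholders.

```python
import boto3

# Sketch: detected text lines like the edge printing above, via DetectText.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates
        print(f'{detection["DetectedText"]} ({detection["Confidence"]:.1f}%)')
```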