Human Generated Data

Title

Untitled (couple posed with their two children near Christmas tree)

Date

1962

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9871

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Clothing 99.9
Apparel 99.9
Person 99.7
Human 99.7
Person 99.7
Person 99.7
Dress 99.5
Shorts 98.1
Suit 97.8
Coat 97.8
Overcoat 97.8
Person 97.8
Female 95.7
Chair 95.2
Furniture 95.2
Face 88.6
Accessories 88.1
Tie 88.1
Accessory 88.1
Woman 85.5
Outdoors 78
Helmet 77.3
Tuxedo 76.5
Man 75.3
Plant 75
People 73.7
Jacket 72.3
Blazer 72.3
Skirt 70.1
Portrait 69.2
Photography 69.2
Photo 69.2
Girl 69.2
Pants 66
Shirt 64.3
Glasses 64
Helmet 62.9
Tree 61
Kid 60.8
Child 60.8
Nature 58.4
Standing 57
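The tag lists above pair each detected label with a confidence score (0-100). A minimal sketch, using a handful of the Amazon values copied from this record, of how such output might be filtered to keep only high-confidence tags (the threshold of 90 is an illustrative choice, not part of the record):

```python
# (label, confidence) pairs copied from the Amazon tag list above.
labels = [
    ("Clothing", 99.9),
    ("Person", 99.7),
    ("Dress", 99.5),
    ("Suit", 97.8),
    ("Helmet", 77.3),
    ("Tree", 61.0),
    ("Child", 60.8),
]

def filter_labels(pairs, min_confidence=90.0):
    """Keep only tags at or above the confidence threshold."""
    return [label for label, score in pairs if score >= min_confidence]

print(filter_labels(labels))  # -> ['Clothing', 'Person', 'Dress', 'Suit']
```

Lowering the threshold admits weaker guesses such as "Helmet" (77.3), which here is almost certainly a misdetection for a posed indoor family portrait.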

Imagga
created on 2022-01-28

kin 100
beach 40.5
people 29.6
man 28.9
silhouette 25.7
sand 24.5
male 24.2
summer 23.8
couple 23.5
sea 23.5
walking 20.8
water 20.7
sunset 20.7
vacation 20.5
ocean 19.9
person 19.3
sky 19.1
family 18.7
love 16.6
adult 15.7
happy 15.7
outdoor 15.3
lifestyle 15.2
fun 15
leisure 14.9
together 14
child 13.7
holiday 13.6
sibling 13.6
active 13.5
relax 13.5
coast 13.5
outdoors 13.4
walk 13.3
happiness 13.3
father 13.1
men 12.9
sun 12.9
sport 12.7
women 12.7
travel 12
mother 12
parent 11.9
black 11.4
boy 11.3
sunny 11.2
tropical 11.1
youth 11.1
two 11
run 10.6
group 10.5
shore 10.2
romance 9.8
portrait 9.7
life 9.6
dad 9.6
play 9.5
relationship 9.4
joy 9.2
sunlight 8.9
kid 8.9
businessman 8.8
dusk 8.6
friends 8.5
sunshine 8.4
friendship 8.4
evening 8.4
waves 8.4
world 8.1
romantic 8
body 8
business 7.9
outside 7.7
coastline 7.5
landscape 7.4
light 7.4
back 7.3
girls 7.3
sexy 7.2
looking 7.2

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 98.7
outdoor 98.3
text 98.1
clothing 94.4
woman 78.3
standing 76.2
man 73
smile 71
footwear 70.4
posing 39.4

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 90.2%
Happy 99.7%
Calm 0.1%
Confused 0.1%
Surprised 0%
Sad 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 48-56
Gender Male, 98.7%
Sad 78.4%
Calm 13.3%
Confused 4.3%
Disgusted 1.4%
Happy 1%
Surprised 0.9%
Angry 0.5%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Male, 99.6%
Happy 92%
Surprised 4%
Calm 1.2%
Fear 1.1%
Sad 0.6%
Disgusted 0.5%
Confused 0.4%
Angry 0.2%

AWS Rekognition

Age 33-41
Gender Male, 99.3%
Happy 98.2%
Calm 1%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%
Sad 0.1%
Fear 0.1%
Angry 0.1%
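Each AWS Rekognition face block above reports a confidence value per emotion; the emotion a viewer would read off as "the" result is simply the highest-scoring one. A minimal sketch, using the first face's values from this record:

```python
# Emotion scores copied from the first AWS Rekognition face above.
face = {
    "Happy": 99.7, "Calm": 0.1, "Confused": 0.1, "Surprised": 0.0,
    "Sad": 0.0, "Disgusted": 0.0, "Fear": 0.0, "Angry": 0.0,
}

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(face))  # -> Happy
```

Note that for the second face the distribution is far less decisive (Sad 78.4% vs. Calm 13.3%), so a single dominant label discards real uncertainty in the model's output.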

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Tie 88.1%
Helmet 77.3%

Captions

Microsoft

a group of people posing for the camera 95.9%
a group of people posing for a picture 95.8%
a group of people posing for a photo 93.9%

Text analysis

Amazon

REPUBLICA