Human Generated Data

Title

Untitled (couple posing on beach with Adirondack chairs)

Date

c. 1970

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11587

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Shorts 99.8
Clothing 99.8
Apparel 99.8
Person 99.6
Human 99.6
Person 99.4
Person 98
Person 97.3
Person 93.6
Female 90.7
Dress 86.6
Person 81.8
Face 80.9
Outdoors 80.6
Tree 79
Plant 79
Nature 76.8
Person 76.4
Grass 75.3
Woman 72.9
Shirt 70.8
Girl 67.7
Pants 66.9
Photography 65.9
Photo 65.9
Portrait 65.9
Kid 65.3
Child 65.3
Person 61
Furniture 60.7
Bench 60.7
Food 59.8
Cream 59.8
Icing 59.8
Dessert 59.8
Cake 59.8
Creme 59.8
City 57.5
Urban 57.5
Building 57.5
Town 57.5
Bench 56.8
Man 56.2

Imagga
created on 2022-01-15

person 19.9
dancer 17.9
people 17.8
silhouette 17.4
sunset 17.1
male 17
man 16.8
performer 16.8
sky 16.6
water 15.3
travel 14.1
sea 14.1
beach 12.8
ocean 12.4
men 12
outdoors 11.9
entertainer 11.9
stage 11.2
love 11
couple 10.4
outdoor 9.9
adult 9.9
night 9.8
old 9.7
teacher 9.7
happiness 9.4
two 9.3
sport 9.2
summer 9
world 9
landscape 8.9
weapon 8.8
tropical 8.5
waiter 8.4
dark 8.3
wedding 8.3
platform 8.3
art 8.2
light 8.2
history 8
dance 8
life 8
women 7.9
black 7.8
scene 7.8
bride 7.7
musical instrument 7.7
dusk 7.6
human 7.5
professional 7.5
city 7.5
evening 7.5
tourism 7.4
tourist 7.4
island 7.3
protection 7.3
danger 7.3
group 7.3
sand 7.2
architecture 7

Google
created on 2022-01-15

(no tags returned)

Microsoft
created on 2022-01-15

text 99.9
wedding 86.2
old 82.4
window 80.4
standing 80.4
black 66.5
ceremony 58.2
black and white 53.3
posing 36.5

Face analysis

AWS Rekognition

Age 40-48
Gender Female, 83.4%
Happy 99.6%
Calm 0.2%
Surprised 0.1%
Sad 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 80.2%
Calm 62.5%
Happy 29.5%
Surprised 3.6%
Sad 1.2%
Disgusted 1.2%
Fear 0.8%
Confused 0.7%
Angry 0.6%

AWS Rekognition

Age 31-41
Gender Female, 88.4%
Calm 52.7%
Fear 33.1%
Happy 7.6%
Sad 1.6%
Disgusted 1.6%
Angry 1.3%
Surprised 1.2%
Confused 0.8%

AWS Rekognition

Age 34-42
Gender Male, 97.8%
Happy 64.8%
Calm 22.2%
Sad 5.3%
Fear 3.9%
Surprised 1.2%
Angry 1.1%
Confused 0.9%
Disgusted 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Bench 60.7%

Captions

Microsoft

an old photo of a group of people standing in front of a window 75.5%
an old photo of a group of people standing in front of a building 75.4%
old photo of a group of people standing in front of a window 73.9%

Text analysis

Amazon

KAI
ALOHA KAI
ALOHA
ALOHA K.J.
K.J.

Google

KAI
ALOHA KAI ALDHA KAI
ALOHA
ALDHA