Human Generated Data

Title

Untitled (couple seated on lounge chairs near ocean)

Date

c. 1955

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10460

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Person 91.8
Tent 90.3
Clothing 87.9
Apparel 87.9
Furniture 73.5
Photo Booth 71.7
Female 65.3

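The Amazon tags above pair each detected label with a confidence score, which is the shape of output returned by the AWS Rekognition DetectLabels API. A minimal sketch of such a call, assuming boto3 credentials are configured and using a hypothetical local file name for the photograph:

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
# The file name and MinConfidence threshold are hypothetical examples.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_couple.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=60.0,
)

# Each label carries a name and a confidence score, e.g. "Person 99.2".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```
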
Clarifai
created on 2023-10-26

people 99.8
adult 98.7
woman 98.4
wear 97.5
man 97.2
dress 92.9
group 92.5
two 92.1
wedding 90.9
family 88.8
three 87.9
child 87
veil 86.9
portrait 86.9
outfit 86.1
administration 85
art 83.6
actress 83
retro 77.9
indoors 77.1

Imagga
created on 2022-01-09

people 24.5
adult 22
man 18.8
bride 17.2
dress 17.1
happy 16.9
couple 16.5
portrait 16.2
window 15.4
person 14.9
groom 14.9
male 14.9
wedding 14.7
stage 13
fashion 12.8
love 12.6
pretty 12.6
happiness 11.7
lifestyle 11.6
family 11.6
black 11.5
modern 11.2
women 11.1
boutique 10.9
clothing 10.7
attractive 10.5
sexy 10.4
wife 10.4
style 10.4
hair 10.3
smile 10
business 9.7
canvas tent 9.4
billboard 9.2
sensuality 9.1
platform 9
outdoors 8.9
lady 8.9
interior 8.8
looking 8.8
smiling 8.7
model 8.5
marriage 8.5
mother 8.5
two 8.5
elegance 8.4
new 8.1
light 8
indoors 7.9
face 7.8
color 7.8
room 7.7
sitting 7.7
married 7.7
passenger 7.6
casual 7.6
relaxation 7.5
human 7.5
leisure 7.5
car 7.5
signboard 7.4
holding 7.4
body 7.2
home 7.2
celebration 7.2
holiday 7.2
romantic 7.1
child 7.1
together 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97
clothing 76.2
person 73.4
black and white 69.8
sketch 60.6
posing 49.3
clothes 20.6

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 90.1%
Sad 82.2%
Happy 10.9%
Calm 5.6%
Fear 0.6%
Disgusted 0.3%
Confused 0.2%
Surprised 0.2%
Angry 0.2%

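Age range, gender, and emotion percentages of the kind listed above are what the Rekognition DetectFaces API returns when full attributes are requested. A minimal sketch, again with a hypothetical file name:

```python
# Minimal sketch: face attribute analysis with AWS Rekognition.
# File name is a hypothetical placeholder for the photograph.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_couple.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back as a list of type/confidence pairs (Sad, Happy, Calm, ...).
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```
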
Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

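Unlike Rekognition, Google Cloud Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages, which is why the Google Vision entries above have no numeric scores. A minimal sketch using the google-cloud-vision client library, with a hypothetical file name:

```python
# Minimal sketch: face annotation with the Google Cloud Vision client library.
# File name is a hypothetical placeholder; likelihoods are categorical buckets,
# not percentages.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_couple.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Joy:", face.joy_likelihood.name)
    print("Sorrow:", face.sorrow_likelihood.name)
    print("Anger:", face.anger_likelihood.name)
    print("Surprise:", face.surprise_likelihood.name)
    print("Headwear:", face.headwear_likelihood.name)
    print("Blurred:", face.blurred_likelihood.name)
```
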
Feature analysis

Amazon

Person 99.2%
Tent 90.3%

Categories

Captions

Microsoft
created on 2022-01-09

a person posing for a photo 52%

Text analysis

Amazon

44419
NAGOY
Mua-
Mua- YT27482
YT27482

Google

44 *19
44
*19
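
Raw text fragments like those above are typical of OCR-style detection APIs such as Rekognition DetectText, which returns detections at both line and word level; that layering can explain why overlapping strings such as "Mua- YT27482" and "YT27482" appear as separate entries. A minimal sketch, with a hypothetical file name:

```python
# Minimal sketch: text detection with AWS Rekognition.
# File name is a hypothetical placeholder for the photograph.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_couple.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Detections come back at two granularities: LINE and WORD.
for det in response["TextDetections"]:
    print(f'{det["Type"]}: {det["DetectedText"]} ({det["Confidence"]:.1f}%)')
```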