Human Generated Data

Title

Untitled (formally dressed man and woman dancing)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5652

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 98.9
Human 98.9
Furniture 98.9
Person 98.8
Clothing 97.3
Apparel 97.3
Person 97.3
Chair 96.6
Shoe 95.8
Footwear 95.8
Shoe 91.4
Person 90.4
Face 87.2
Text 80.7
Suit 80.2
Coat 80.2
Overcoat 80.2
Female 76.6
Robe 69.3
Fashion 69.3
Portrait 68.5
Photography 68.5
Photo 68.5
People 67.7
Girl 64.3
Sleeve 64.2
Sunglasses 63.6
Accessories 63.6
Accessory 63.6
Person 62.7
Sunglasses 62.5
Dress 61.1
Gown 58.9
Outdoors 58.4
Woman 58.4
Bridegroom 58.2
Wedding 58.2
Sport 55.4
Sports 55.4
Pants 55.1
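
The label/confidence pairs above are the kind of output the Amazon Rekognition DetectLabels API returns. A minimal sketch of reproducing such a tag list with boto3 follows; the image path and the 55% confidence floor are assumptions for illustration, not part of this record.

# Sketch: label tags for a local image via Amazon Rekognition DetectLabels.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff, roughly matching the lowest scores listed above
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")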

Clarifai
created on 2023-10-15

people 99.1
man 98.5
adult 95.5
wedding 93.4
woman 93.4
wear 91.1
group 90.9
monochrome 83.1
group together 81.9
ceremony 78.2
groom 77.1
two 76.3
veil 76.2
many 73.7
street 73.6
chair 73.1
bride 68.7
fashion 64.5
dinner jacket 64.4
indoors 64.2
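
Concept/confidence pairs like the Clarifai list above can be requested through Clarifai's gRPC API. The sketch below follows the publicly documented PostModelOutputs pattern; the personal access token, the "clarifai/main" app, the "general-image-recognition" model ID, and the file path are all placeholders, and the exact model used for this record is not stated.

# Sketch: concept tags from a Clarifai general recognition model (gRPC client).
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_PAT"),)  # placeholder personal access token

with open("photo.jpg", "rb") as f:  # placeholder path
    request = service_pb2.PostModelOutputsRequest(
        user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
        model_id="general-image-recognition",  # assumed model
        inputs=[resources_pb2.Input(
            data=resources_pb2.Data(image=resources_pb2.Image(base64=f.read()))
        )],
    )

response = stub.PostModelOutputs(request, metadata=metadata)
for concept in response.outputs[0].data.concepts:
    print(concept.name, round(concept.value * 100, 1))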

Imagga
created on 2021-12-15

person 21.7
people 20.6
adult 19
clothing 18.8
portrait 18.1
negative 15.5
man 15.4
dress 15.3
model 14.8
fashion 14.3
male 14.2
indoors 13.2
film 12.9
face 12.8
black 12.7
lady 11.4
human 11.2
attractive 11.2
men 11.2
couple 10.4
sexy 10.4
art 10.4
women 10.3
love 10.3
lifestyle 10.1
patient 10.1
pretty 9.8
posing 9.8
health 9.7
medical 9.7
style 9.6
equipment 9.5
boutique 9.4
elegance 9.2
photographic paper 9.2
wedding 9.2
bride 8.6
traditional 8.3
garment 8.3
nurse 8.2
pose 8.1
mask 8.1
body 8
celebration 8
smiling 8
sculpture 7.9
medicine 7.9
business 7.9
cute 7.9
luxury 7.7
elegant 7.7
room 7.5
clothes 7.5
inside 7.4
decoration 7.3
hospital 7.3
looking 7.2
religion 7.2
team 7.2
romantic 7.1
family 7.1
interior 7.1
happiness 7
together 7
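
The Imagga tags above resemble output from Imagga's v2 tagging endpoint. A rough sketch with the requests library is shown below; the API key/secret and image URL are placeholders, and the response shape is taken from Imagga's public documentation rather than from this record.

# Sketch: tags from the Imagga REST API for a hosted image.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)

for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))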

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.4
person 97.1
chair 83.1
furniture 74.3
clothing 72.2
old 71.6
player 69.9
wedding 62
sketch 55.3
posing 51.2
drawing 50
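
The Microsoft tags above are consistent with the Azure Computer Vision tagging operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image path are placeholders, and this is an illustrative reconstruction rather than the pipeline actually used.

# Sketch: image tags from Azure Computer Vision (tag_image_in_stream).
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
)

with open("photo.jpg", "rb") as f:  # placeholder path
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))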

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 89.6%
Calm 79.1%
Sad 15.6%
Angry 3.8%
Happy 0.4%
Surprised 0.4%
Confused 0.3%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 22-34
Gender Female, 90.5%
Sad 80.3%
Calm 18.8%
Angry 0.4%
Confused 0.2%
Fear 0.1%
Happy 0.1%
Surprised 0.1%
Disgusted 0%

AWS Rekognition

Age 24-38
Gender Male, 92.9%
Calm 97.4%
Surprised 2%
Sad 0.2%
Angry 0.2%
Happy 0.1%
Confused 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 20-32
Gender Female, 53%
Calm 46.5%
Happy 38.1%
Sad 7.3%
Fear 3.6%
Angry 2%
Surprised 1.4%
Confused 0.7%
Disgusted 0.3%
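
Age-range, gender, and emotion estimates like the four AWS Rekognition blocks above come from the DetectFaces API. A minimal boto3 sketch is below; the image path is a placeholder.

# Sketch: per-face age, gender, and emotion estimates via Rekognition DetectFaces.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are reported with confidences, highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")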

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
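
The likelihood buckets in the Google Vision blocks above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) are what the Cloud Vision face detection API returns. A minimal sketch with the google-cloud-vision client follows; it assumes application default credentials and a placeholder image path.

# Sketch: per-face likelihood buckets via Google Cloud Vision face detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)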

Feature analysis

Amazon

Person 98.9%
Chair 96.6%
Shoe 95.8%
Sunglasses 63.6%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

14606
14606.
VEEV
14606. .PENNOCK+BUCK 'O'NEILL.
.PENNOCK+BUCK
VEEV ЭЛЬЕВЬИИ ЬВЕ22
'O'NEILL.
ЬВЕ22
ЭЛЬЕВЬИИ
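
The Amazon text detections above (including the reversed, Cyrillic-looking strings, which are the model's literal reading of mirrored press markings) are the raw output of the Rekognition DetectText API. A minimal boto3 sketch follows; the image path is a placeholder.

# Sketch: OCR lines and words via Rekognition DetectText.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; DetectedText is reported verbatim.
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))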

Google

bBE22
14606.PENNOCK
t
BUCK'O'NEILL.
14606 veEV 20BEBBVM bBE22 14606.PENNOCK t BUCK'O'NEILL.
14606
veEV
20BEBBVM
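
The Google text results above can be reproduced with the Cloud Vision text detection API; the first annotation is the full detected block and the remaining ones are individual tokens. A minimal sketch with a placeholder image path:

# Sketch: OCR output via Google Cloud Vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)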