Human Generated Data

Title

Untitled ("players" raising their glasses on a stage set)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10702

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Clothing 99.5
Apparel 99.5
Person 97.9
Human 97.9
Person 97.1
Person 89.1
Indoors 85.3
Sleeve 84.2
Room 84.1
Female 77.5
Floor 72.9
Woman 67.7
Evening Dress 67.6
Fashion 67.6
Gown 67.6
Robe 67.6
Dress 66.7
Long Sleeve 64.2
Dressing Room 62.4
Flooring 60.1
Mannequin 56.3
Art 56
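
The list above matches the output format of Amazon Rekognition's label detection (label name plus confidence score). A minimal Python sketch with boto3 follows; the image file name and the MaxLabels/MinConfidence values are illustrative assumptions, not details of the museum's actual tagging pipeline.

import boto3

# Assumes AWS credentials are already configured; the image path is hypothetical.
rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.10702.jpg", "rb") as f:
    image_bytes = f.read()

# MinConfidence=55 roughly matches the lowest score listed above (Art 56).
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")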

Clarifai
created on 2023-10-26

people 99.9
adult 98.8
group 98.6
woman 98.2
wear 97.5
two 96.3
three 95.3
man 94.8
monochrome 90.5
group together 89.8
four 89.8
commerce 87.8
several 86.2
stock 85.5
actress 85.4
administration 83.1
child 83
outfit 82.3
family 81.9
actor 80.6
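
Clarifai's concept scores can be obtained from its v2 predict REST endpoint. The sketch below is illustrative only: the API key, image file name, and model identifier are placeholder assumptions, and the response shape follows Clarifai's published v2 format (concepts scored 0–1, scaled here to match the list above).

import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public general model

with open("steinmetz_4.2002.10702.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")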

Imagga
created on 2022-01-15

boutique 38.6
people 35.7
business 25.5
man 24.2
adult 20.9
women 19.8
clothing 19.2
fashion 18.8
person 17.7
men 17.2
dress 17.2
group 16.9
male 15.6
shop 15.2
happy 15
interior 15
clothes 15
modern 14
urban 14
professional 13.9
businessman 13.2
corporate 12.9
office 12.8
attractive 12.6
happiness 12.5
city 12.5
window 12.1
building 12
pretty 11.9
gown 11.6
meeting 11.3
portrait 11
mall 10.7
life 10.2
work 10.2
shopping 10.2
suit 10.1
smile 10
bags 9.7
human 9.7
bride 9.6
couple 9.6
teacher 9.5
two 9.3
outfit 9.3
elegance 9.2
room 9.1
groom 9
team 9
indoors 8.8
businesspeople 8.5
adults 8.5
store 8.5
travel 8.4
old 8.4
church 8.3
wedding 8.3
lady 8.1
new 8.1
family 8
home 8
black 7.9
standing 7.8
hands 7.8
scene 7.8
glass 7.8
model 7.8
salon 7.6
casual 7.6
walking 7.6
legs 7.5
hairdresser 7.5
style 7.4
sale 7.4
inside 7.4
light 7.3
successful 7.3
indoor 7.3
businesswoman 7.3
success 7.2
smiling 7.2
lifestyle 7.2
looking 7.2
hair 7.1
day 7.1
architecture 7
together 7
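
Imagga exposes its tagger through a REST endpoint (/v2/tags) authenticated with HTTP Basic credentials. The sketch below is an assumption based on Imagga's public API documentation; the credentials and file name are placeholders.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder

# Upload the image directly; /v2/tags also accepts an image_url parameter.
with open("steinmetz_4.2002.10702.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")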

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 98
drawing 96.8
clothing 93.7
text 93
sketch 92.9
dress 88.7
woman 80.4
painting 70.7
cartoon 58.2
footwear 55.2
clothes 15.1
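
Microsoft's tags correspond to the Azure Computer Vision Tag operation, which returns tag names with confidences between 0 and 1. The sketch below uses the public REST call; the endpoint, subscription key, API version, and file name are placeholder assumptions.

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("steinmetz_4.2002.10702.jpg", "rb") as f:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
response.raise_for_status()

for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")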

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 99.7%
Calm 56.3%
Happy 25%
Surprised 14%
Confused 2.4%
Sad 0.9%
Disgusted 0.6%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 49-57
Gender Female, 89.8%
Sad 50.6%
Calm 21.6%
Surprised 12.1%
Happy 7.5%
Fear 3.8%
Disgusted 1.8%
Confused 1.5%
Angry 1.1%

AWS Rekognition

Age 27-37
Gender Female, 98.6%
Calm 99.3%
Fear 0.2%
Happy 0.2%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Sad 0%
Angry 0%
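
The age ranges, gender estimates, and emotion scores above are the standard fields returned by Amazon Rekognition's DetectFaces operation when all attributes are requested. A minimal boto3 sketch follows; the image path is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.10702.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}  Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")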

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
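
The Google Vision results above use the Face Detection likelihood scale (VERY_UNLIKELY through VERY_LIKELY, rendered here as "Very unlikely" and so on). A minimal sketch with the google-cloud-vision client follows; the file name is a placeholder and the code assumes application-default credentials are configured.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.10702.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each likelihood is an enum value on the Very unlikely ... Very likely scale.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)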

Feature analysis

Amazon

Person 97.9%

Text analysis

Amazon

EORSE
MODER-SVEELA
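
The strings above (likely OCR misreads of signage or props in the scene) are line-level results of the kind returned by Amazon Rekognition's DetectText operation. A minimal boto3 sketch follows; the image path is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.10702.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Keep LINE-level detections; WORD-level entries repeat the same text split up.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")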