Human Generated Data

Title

Untitled (woman putting salad on set table)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7357

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Clothing 86.6
Apparel 86.6
Furniture 85.3
Table 83
Dining Table 78.4
Face 78.3
Food 77.2
Meal 77.2
Car 75.3
Vehicle 75.3
Transportation 75.3
Automobile 75.3
Home Decor 74.6
Glass 71.9
Female 71.4
Text 70.8
Glasses 69.9
Accessory 69.9
Accessories 69.9
Beverage 67.3
Drink 67.3
Portrait 65.6
Photography 65.6
Photo 65.6
Advertisement 58.8
Dish 58.3
Alcohol 57.1
Woman 56.8
Linen 55.3
Car 52.2

Clarifai
created on 2023-10-26

people 98.7
monochrome 97.8
adult 94.5
retro 94.1
vintage 93.5
nostalgia 89.8
party 88.2
indoors 88.2
restaurant 85.2
man 85
vertical 84.1
wine 81.5
woman 80.1
art 80
portrait 80
style 78.9
bar 78
antique 76.2
furniture 75.7
black and white 75.4

Imagga
created on 2022-01-15

bartender 100
glass 32.4
wine 29.3
party 22.3
table 21.7
alcohol 18.1
celebration 17.5
restaurant 17.1
black 16.9
drink 16.7
glasses 15.7
wineglass 15.1
dinner 13.7
bottle 12.7
person 12.7
man 12.2
luxury 12
love 11.8
male 11.3
elegant 11.1
wedding 11
dining 10.5
beverage 10.3
people 10
happy 10
music 10
champagne 9.9
lady 9.7
interior 9.7
waiter 9.7
toast 9.7
sexy 9.6
setting 9.6
lifestyle 9.4
rose 9.4
light 9.3
plate 9.3
bar 9.2
food 9.2
meal 9.1
pretty 9.1
attractive 9.1
wet 8.9
couple 8.7
decoration 8.7
men 8.6
adult 8.5
portrait 8.4
modern 8.4
entertainment 8.3
fun 8.2
romance 8
musician 7.9
liquid 7.9
smile 7.8
reception 7.8
standing 7.8
hotel 7.6
crystal 7.5
holding 7.4
service 7.4
water 7.3
container 7.3
home 7.2
romantic 7.1
indoors 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.7
person 94.2
food 90
clothing 89.6
black and white 75.2
man 73.6
human face 63.7
fast food 63.7

Face analysis

AWS Rekognition

Age 51-59
Gender Female, 98.3%
Happy 59.6%
Calm 38.1%
Angry 0.6%
Sad 0.6%
Disgusted 0.3%
Surprised 0.3%
Confused 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Car 75.3%

Text analysis

Amazon

KEY,
SIESTA
J.
23176
STEINMETZ,
J. J. STEINMETZ, SIESTA KEY, SARASOTA,-FLA:
SARASOTA,-FLA:
RR
23118
ODYKE
ODYKE EVER
EVER

Google

J.
STEINMETZ,
SIESTA
SARASOTA,
FLA:
23176 J. J. STEINMETZ, SIESTA KEY, SARASOTA, FLA:
23176
KEY,