Human Generated Data

Title

Untitled ("Sub Debs" in swim suits sit and stand around table)

Date

1949

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5578

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.6
Human 99.6
Person 99.4
Chair 99
Furniture 99
Person 98.8
Person 98.8
Person 98.4
Person 96.6
Chair 95.7
Clothing 94.7
Apparel 94.7
Person 92.9
People 89.1
Female 84.8
Dress 81.9
Crowd 77.9
Face 75.3
Table 71.9
Woman 68.3
Photography 67.1
Photo 67.1
Portrait 66.8
Girl 65.3
Person 65.1
Tablecloth 63.1
Meal 59.7
Food 59.7
Dining Table 59.1
Stage 57.3

Clarifai
created on 2023-10-15

people 99.1
man 97.2
woman 96.3
illustration 94.7
child 93.9
group 93.6
adult 93.6
sitting 92.4
sit 92.2
retro 92.1
dancing 88.8
fun 87.9
wear 87.4
chair 84.7
desktop 83.6
crowd 82.9
adolescent 82.5
many 82
group together 81.3
enjoyment 81

Imagga
created on 2021-12-15

sketch 52.9
drawing 50.7
representation 26.9
silhouette 24
design 23.1
wagon 21.1
art 21
graphic 19.7
grunge 19.6
business 18.2
wheeled vehicle 16.1
black 14.4
man 14.1
pattern 13.7
human 12
style 11.9
backgrounds 11.4
plan 11.3
people 11.2
clip art 11.1
symbol 10.8
element 10.7
male 10.6
businessman 10.6
office 10.4
construction 10.3
architecture 10.2
facility 10.1
gymnasium 10
container 9.9
team 9.9
retro 9.8
modern 9.8
cartoon 9.8
shape 9.6
line 9.4
house 9.2
sport 9.1
painting 9
urban 8.7
life 8.6
outline 8.5
color 8.3
fashion 8.3
group 8.1
decoration 8
structure 7.8
artistic 7.8
space 7.8
cityscape 7.6
arrow 7.5
city 7.5
paint 7.2
music 7.2
athletic facility 7.2
activity 7.2

Google
created on 2021-12-15

Furniture 93.8
Table 89.9
Chair 89.5
Style 83.9
Black-and-white 83
Art 82.4
Line 81.9
Font 81.1
Adaptation 79.2
Snapshot 74.3
Tent 73.6
Monochrome photography 73.4
Suit 72.8
Event 71.9
Monochrome 71.9
Painting 71.2
Motor vehicle 68.7
Illustration 68
Rectangle 67.1
Stock photography 65.3

Microsoft
created on 2021-12-15

text 99.5
person 95.2
old 87.6
outdoor 85.3
posing 84
chair 83.9
table 79.7
furniture 68.1
people 57.8

Face analysis

AWS Rekognition

Age 23-37
Gender Female, 62.9%
Calm 45.9%
Happy 33.1%
Sad 18.1%
Angry 1.2%
Confused 1%
Disgusted 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 18-30
Gender Female, 92.7%
Calm 84.5%
Happy 10.9%
Sad 3.5%
Confused 0.5%
Surprised 0.4%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 37-55
Gender Female, 67.8%
Calm 94.8%
Angry 1.8%
Sad 1.4%
Happy 0.7%
Confused 0.4%
Surprised 0.4%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 33-49
Gender Male, 87.1%
Calm 95.6%
Sad 2%
Happy 1.4%
Confused 0.5%
Angry 0.3%
Surprised 0.2%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 39-57
Gender Male, 70.1%
Happy 55.7%
Calm 33.7%
Sad 5.6%
Confused 2.5%
Surprised 1.1%
Angry 1%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 23-35
Gender Female, 64.8%
Angry 29%
Happy 27%
Calm 21.9%
Confused 9.7%
Surprised 5.6%
Sad 5.4%
Disgusted 1%
Fear 0.3%

AWS Rekognition

Age 16-28
Gender Female, 98%
Calm 68.4%
Happy 17.4%
Sad 11.6%
Confused 0.9%
Angry 0.8%
Surprised 0.5%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 23-37
Gender Female, 92.4%
Calm 92.3%
Sad 3.7%
Happy 2%
Angry 1%
Confused 0.7%
Fear 0.1%
Surprised 0.1%
Disgusted 0%

AWS Rekognition

Age 25-39
Gender Female, 74.7%
Calm 73.3%
Surprised 18.9%
Sad 2.5%
Confused 2.3%
Happy 1.5%
Angry 0.8%
Fear 0.4%
Disgusted 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Chair 99%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

FLORIDA
SARASOTA,
STEINMETZ, SARASOTA, FLORIDA 25145
STEINMETZ,
25145
YТЭЗА-X

Google

STEINMETZ, SARASOTA, FLORIDA 2514S
STEINMETZ,
SARASOTA,
FLORIDA
2514S