Human Generated Data

Title

Untitled (residents seated at tables around pool, Ellenton, Florida)

Date

c. 1960

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11543

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Restaurant 99.9
Person 99.5
Human 99.5
Person 99.3
Person 99.2
Chair 98.3
Furniture 98.3
Cafeteria 97.6
Person 96
Person 91.1
Chair 90.8
Person 89.1
Person 88.2
Cafe 81.1
Person 80.7
Room 80.3
Indoors 80.3
Chair 78
Person 75.1
Person 72.7
Food Court 71.9
Food 71.9
Person 71.5
Meal 71
Table 67.1
Dining Table 65.5
Person 55.7
Person 52.7
Person 49.6

Clarifai
created on 2023-10-25

people 99.3
group 95.4
adult 94.2
classroom 93.7
man 93.4
group together 93.3
woman 92.5
education 91.5
school 91.4
many 91.2
chair 90.6
sit 90.4
furniture 90.1
monochrome 89.2
desk 87.9
teacher 86.9
indoors 85.6
room 84.1
boy 82
recreation 81.6

Imagga
created on 2022-01-15

room 60.8
classroom 46.3
table 43.8
restaurant 37.5
interior 33.6
chair 32.3
cafeteria 31.2
group 28.2
modern 25.9
people 23.4
design 23.1
office 22.3
business 21.3
hall 21.1
building 20.3
person 19.6
furniture 19.6
indoors 17.6
empty 17.2
man 16.8
center 16.8
computer network 16.6
work 16.5
monitor 16.2
dining 16.2
meeting 16
businessman 15.9
floor 15.8
male 15.6
contemporary 15
desk 14.9
professional 14.3
wood 14.2
teacher 14.1
computer 14.1
inside 13.8
structure 13.8
chairs 13.7
indoor 13.7
seat 13.6
team 13.4
men 12.9
dinner 12.6
network 12.6
equipment 12.6
class 12.5
house 12.5
glass 12.4
education 12.1
teamwork 12
coffee 12
corporate 12
bar 12
home 12
businesswoman 11.8
communication 11.8
kitchen 11.6
decor 11.5
food 11.5
comfortable 11.5
electronic equipment 10.9
board 10.9
tables 10.8
conference 10.7
audience 10.7
success 10.5
light 10
drink 10
adult 9.7
window 9.7
executive 9.6
talking 9.5
women 9.5
learning 9.4
architecture 9.4
manager 9.3
presentation 9.3
lights 9.3
eat 9.2
blackboard 9.2
worker 9.2
silhouette 9.1
meal 8.9
style 8.9
job 8.8
pop 8.7
decoration 8.7
sitting 8.6
luxury 8.6
lunch 8.5
student 8.5
elegance 8.4
school 8.3
technology 8.2
new 8.1
smiling 8
laptop 7.9
stool 7.9
bright 7.9
students 7.8
live 7.8
teaching 7.8
scene 7.8
guitar 7.8
concert 7.8
counter 7.7
classical 7.6
happy 7.5
row 7.5
occupation 7.3
entrepreneur 7.3
suit 7.2
musician 7.1
working 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 100
table 97.1
person 92.9
furniture 92.2
chair 90.2
people 82.2
clothing 70.3
white 67

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 58.1%
Calm 96.1%
Sad 1.8%
Disgusted 0.7%
Fear 0.7%
Happy 0.2%
Angry 0.2%
Surprised 0.2%
Confused 0.1%

AWS Rekognition

Age 34-42
Gender Female, 73%
Calm 99.9%
Sad 0%
Fear 0%
Happy 0%
Disgusted 0%
Surprised 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 51-59
Gender Male, 99.8%
Calm 99.4%
Happy 0.2%
Surprised 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 40-48
Gender Male, 61%
Calm 96.5%
Surprised 1.8%
Sad 0.7%
Happy 0.5%
Disgusted 0.2%
Angry 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 48-54
Gender Female, 92.6%
Happy 74.2%
Calm 21.7%
Sad 3.1%
Disgusted 0.3%
Surprised 0.2%
Angry 0.2%
Fear 0.2%
Confused 0.1%

AWS Rekognition

Age 20-28
Gender Female, 55.2%
Calm 90.1%
Sad 3.5%
Happy 1.8%
Angry 1.5%
Confused 0.9%
Surprised 0.8%
Disgusted 0.8%
Fear 0.7%

AWS Rekognition

Age 39-47
Gender Male, 69.3%
Calm 83.8%
Happy 7.5%
Surprised 3.4%
Disgusted 1.3%
Confused 1.3%
Fear 1.3%
Sad 0.9%
Angry 0.6%

AWS Rekognition

Age 36-44
Gender Female, 70.6%
Calm 69%
Fear 15.6%
Sad 8.3%
Happy 1.9%
Confused 1.8%
Disgusted 1.4%
Surprised 1%
Angry 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.5%
Chair 98.3%

Text analysis

Amazon

47415
FLORIDA
SARASOTA.
STEINMETZ. SARASOTA. FLORIDA
STEINMETZ.
م
SIhLh م
SIhLh

Google

47415
47415 STEINMETZ. SARASOTA. FLORIDA
STEINMETZ.
SARASOTA.
FLORIDA