Human Generated Data

Title

Untitled (group of men playing billiards)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7286

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Clinic 99.9
Person 99.8
Human 99.8
Operating Theatre 99.8
Hospital 99.8
Person 99.5
Person 99.4
Person 99.3
Person 98.8
Person 98.7
Person 98.2
Room 97.9
Indoors 97.9
Person 95.1
Surgery 93.4
Doctor 93.4
Furniture 69.8
Table 62.3

Clarifai
created on 2023-10-26

people 99.7
group 98.8
adult 98.5
man 98.2
group together 97.9
many 96.4
several 94.6
vehicle 94
indoors 92.9
monochrome 92.6
watercraft 91.6
three 88.3
administration 87.9
transportation system 87.2
five 86.8
woman 85.2
furniture 83
four 82.9
recreation 82.2
military 81.3

Imagga
created on 2022-01-15

stage 23.7
percussion instrument 21.9
man 21.5
people 20.6
musical instrument 20.5
device 17.7
platform 17.6
person 16.5
male 15.6
adult 14
education 13.8
classroom 13.8
teacher 13.3
black 13.2
lifestyle 13
men 12.9
school 12.6
equipment 12.2
class 11.6
hand 11.4
computer 11.3
board 10.9
youth 10.2
student 10.1
vibraphone 9.9
music 9.9
modern 9.8
business 9.7
technology 9.6
smiling 9.4
horizontal 9.2
room 9.1
marimba 9.1
holding 9.1
group 8.9
teaching 8.8
child 8.7
boy 8.7
water 8.7
old 8.3
color 8.3
silhouette 8.3
interior 8
indoors 7.9
design 7.9
work 7.8
smile 7.8
portrait 7.8
sitting 7.7
chair 7.7
grunge 7.7
sky 7.6
happy 7.5
fun 7.5
outdoors 7.5
light 7.3
blackboard 7.2
table 7.2
transportation 7.2
kitchen 7.1
women 7.1
desk 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.2
outdoor 85.8
table 73.8
person 70.2
black and white 60.3
man 56.5
funeral 54
furniture 50.9

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 97.8%
Sad 1.3%
Surprised 0.4%
Happy 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 99.8%
Calm 99.8%
Surprised 0.1%
Sad 0%
Disgusted 0%
Happy 0%
Confused 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 43-51
Gender Male, 99.4%
Calm 97.4%
Sad 0.9%
Surprised 0.8%
Confused 0.4%
Angry 0.2%
Happy 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.6%
Surprised 93.5%
Happy 3.1%
Calm 2.1%
Fear 0.5%
Confused 0.2%
Disgusted 0.2%
Angry 0.2%
Sad 0.2%

AWS Rekognition

Age 28-38
Gender Female, 97.9%
Calm 99.8%
Sad 0.1%
Confused 0.1%
Happy 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 99.4%
Calm 100%
Surprised 0%
Sad 0%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 99.9%
Calm 54.4%
Surprised 31.2%
Happy 7.9%
Sad 2%
Disgusted 1.9%
Angry 1.5%
Fear 0.6%
Confused 0.6%

AWS Rekognition

Age 21-29
Gender Male, 97.7%
Calm 87.2%
Surprised 11.9%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Sad 0.1%
Fear 0.1%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

FLORIDA
SARASOTA,
STEINMETZ, SARASOTA, FLORIDA
STEINMETZ,
25778
KODVK-SVEELA

Google

25778 STEINMETZ, SARASOTA, FLORIDA YT3RA2-A
25778
STEINMETZ,
SARASOTA,
FLORIDA
YT3RA2-A