Human Generated Data

Title

Untitled (woman standing around table with coffee and dishes)

Date

c. 1960

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11540

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 98.6
Person 98.3
Person 98.2
Person 98.1
Person 96
Person 94.9
Person 94.8
Chair 93.7
Furniture 93.7
Pub 81.5
Bar Counter 75.2
Meal 74.3
Food 74.3
Crowd 72.9
Table 71.2
Person 70.1
Person 66.8
Stage 61.3
People 59.6
Restaurant 56.1

Clarifai
created on 2023-10-25

people 99.4
group together 96.4
group 95.7
music 95.2
indoors 94.7
musician 94.5
man 92.9
woman 90.1
adult 89.6
furniture 89.4
room 89.4
audience 88.8
singer 88.7
bar 88.6
chair 86.6
many 86.4
sit 83.8
monochrome 83.1
jazz 82
band 81.6

Imagga
created on 2022-01-15

marimba 66.7
percussion instrument 62.8
musical instrument 51.3
people 31.2
male 28.3
man 24.9
person 24.4
group 20.9
blackboard 19.8
silhouette 19.8
classroom 18.2
men 17.2
businessman 16.8
business 15.2
teacher 14.9
education 14.7
student 14.5
black 14.4
happy 13.1
office 12.8
room 12.8
school 12.6
class 12.5
women 11.9
work 11.8
adult 11.6
couple 11.3
boy 11.3
success 11.3
design 11.2
old 11.1
smiling 10.8
crowd 10.5
youth 10.2
music 9.9
team 9.8
teaching 9.7
sitting 9.4
symbol 9.4
two 9.3
musician 9.2
desk 9.2
hand 9.1
modern 9.1
board 9
professional 9
table 9
vibraphone 8.9
night 8.9
practice 8.7
art 8.7
patriotic 8.6
meeting 8.5
film 8.4
study 8.4
stage 8.3
sign 8.3
human 8.2
looking 8
job 8
vibrant 7.9
smile 7.8
nighttime 7.8
stadium 7.8
concert 7.8
portrait 7.8
drawing 7.7
diagram 7.7
grunge 7.7
musical 7.7
chart 7.6
child 7.6
teamwork 7.4
lights 7.4
star 7.4
flag 7.3
cheerful 7.3
icon 7.1
to 7.1
player 7

Google
created on 2022-01-15

Black 89.7
Drinkware 85.5
Style 83.8
Black-and-white 83.7
Line 81.7
Barware 77.1
Font 76.1
Monochrome photography 74.7
Monochrome 74.1
Event 71.7
Glass 70.4
Display case 68.5
Room 61.5
Entertainment 54.9
Team 52
Recreation 51.8
Visual arts 50.1

Microsoft
created on 2022-01-15

text 99.4
person 90.7
clothing 89.8
posing 89.3
old 79.6
black 70.4
player 69.4
man 68.2
group 60.6
black and white 60
team 37.5
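
Each provider list above uses the same `label confidence` line format, with labels that may themselves contain spaces ("Bar Counter", "black and white"). A minimal sketch of parsing such lines into structured pairs — the helper name and sample lines are illustrative, not part of the record:

```python
def parse_tags(lines):
    """Return (label, confidence) tuples from 'label confidence' lines.

    Labels can contain spaces, so the confidence is taken from the
    final whitespace-separated token; blank lines are skipped.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# Sample lines copied from the tag lists above.
sample = ["Person 99.4", "Bar Counter 75.2", "Monochrome photography 74.7"]
print(parse_tags(sample))
```

Using `rpartition` rather than `split` keeps multi-word labels intact while still isolating the trailing numeric score.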

Face analysis

AWS Rekognition

Age 49-57
Gender Female, 82.9%
Happy 50.7%
Calm 34.9%
Sad 4.8%
Fear 2.8%
Angry 2.1%
Disgusted 1.7%
Surprised 1.7%
Confused 1.3%

AWS Rekognition

Age 39-47
Gender Male, 99.7%
Happy 48.5%
Calm 45.9%
Surprised 1.9%
Confused 1.2%
Sad 0.8%
Angry 0.6%
Disgusted 0.6%
Fear 0.6%

AWS Rekognition

Age 49-57
Gender Male, 96.9%
Calm 60.1%
Happy 28.6%
Angry 4.6%
Surprised 3.3%
Disgusted 1.2%
Sad 1.2%
Confused 0.6%
Fear 0.4%

AWS Rekognition

Age 48-54
Gender Male, 76.2%
Calm 95.2%
Happy 1.6%
Surprised 1.3%
Sad 0.7%
Angry 0.5%
Disgusted 0.4%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 38-46
Gender Male, 85%
Calm 57.4%
Sad 32.5%
Confused 4.6%
Happy 2.9%
Angry 1%
Surprised 0.7%
Disgusted 0.6%
Fear 0.3%

AWS Rekognition

Age 40-48
Gender Female, 69%
Calm 52.6%
Happy 45.9%
Sad 0.7%
Confused 0.3%
Surprised 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 54-64
Gender Male, 95.3%
Calm 89.8%
Happy 2.3%
Surprised 2%
Fear 1.7%
Confused 1.6%
Angry 1.1%
Sad 0.9%
Disgusted 0.6%

AWS Rekognition

Age 48-54
Gender Male, 77.9%
Happy 93.6%
Calm 3.3%
Angry 1%
Disgusted 0.8%
Surprised 0.5%
Sad 0.4%
Fear 0.3%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Chair 93.7%

Text analysis

Amazon

FLORIDA
SARASOTA.
STEINMETZ. SARASOTA. FLORIDA 47409.
STEINMETZ.
47409.
MAGOX
MILD

Google

STEINMETZ. SARASOTA. FLORIDA U7407.
STEINMETZ.
SARASOTA.
FLORIDA
U7407.