Human Generated Data

Title

Untitled (bar at Bird Key Yacht Club, FL)

Date

c. 1965

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11475

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Restaurant 99.7
Person 99.2
Human 99.2
Person 99
Person 98.8
Person 98.2
Person 98.1
Person 97.7
Person 97.6
Furniture 97.1
Chair 97.1
Cafe 96.2
Person 95.3
Person 94.3
Person 89.3
Cafeteria 88
Chair 84.4
Chair 82.1
Person 78.7
Food 74.9
Meal 74.9
Food Court 73.5
Apparel 65.2
Clothing 65.2
Chair 62.7

Imagga
created on 2022-01-14

restaurant 37.9
room 31.4
cafeteria 27.5
chair 27.5
classroom 27.4
table 25.9
building 23.8
interior 23
people 22.9
shop 21.2
man 16.8
indoors 16.7
counter 16.7
structure 14.9
barbershop 14.8
modern 14.7
business 14.6
person 14.2
floor 13.9
mercantile establishment 13.6
male 13.5
work 13.4
hall 13.3
lifestyle 13
furniture 12.8
women 12.6
office 12.6
empty 12
sitting 12
urban 11.4
group 11.3
design 11.2
men 11.2
inside 11
house 10.9
shopping cart 10.8
supermarket 10.6
dinner 10.3
architecture 10.1
city 10
food 9.8
barroom 9.8
working 9.7
together 9.6
computer 9.6
home 9.6
boy 9.6
dining 9.5
adult 9.4
place of business 9.3
glass 9.3
casual 9.3
teacher 8.9
chairs 8.8
class 8.7
comfortable 8.6
learning 8.4
study 8.4
teamwork 8.3
bar 8.3
meal 8.3
technology 8.2
team 8.1
kitchen 8
seat 8
family 8
lunch 8
smiling 8
tables 7.9
day 7.8
education 7.8
communication 7.6
contemporary 7.5
drink 7.5
buy 7.5
wood 7.5
student 7.5
place 7.4
coffee 7.4
service 7.4
life 7.4
shopping 7.3
occupation 7.3
board 7.2
black 7.2
school 7.1
handcart 7.1
businessman 7.1

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 99.8
person 96.9
clothing 92.6
table 91.6
chair 90.8
furniture 85.3
man 77.3
woman 52.5
black and white 51.5

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 99.3%
Calm 86.6%
Sad 10.3%
Confused 1.6%
Disgusted 0.4%
Surprised 0.4%
Angry 0.4%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Male, 82%
Calm 81.4%
Happy 12.1%
Sad 3.1%
Confused 1.1%
Surprised 0.8%
Disgusted 0.6%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 35-43
Gender Male, 99.4%
Happy 99.6%
Surprised 0.1%
Sad 0.1%
Disgusted 0%
Calm 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 97.2%
Calm 58.2%
Sad 27.9%
Confused 6.4%
Disgusted 2.1%
Surprised 2%
Happy 1.7%
Fear 1.1%
Angry 0.7%

AWS Rekognition

Age 25-35
Gender Male, 88%
Calm 99.8%
Sad 0.1%
Confused 0%
Angry 0%
Happy 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Male, 51.5%
Sad 86.5%
Calm 11.9%
Confused 0.7%
Happy 0.3%
Fear 0.2%
Surprised 0.2%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 20-28
Gender Female, 54.7%
Sad 56.9%
Calm 40.4%
Confused 1%
Happy 0.5%
Angry 0.4%
Fear 0.3%
Surprised 0.2%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.2%
Chair 97.1%

Captions

Microsoft

a group of people standing in front of a building 79.3%
a group of people around each other 78.2%
a group of people sitting at a table 73.6%

Text analysis

Amazon

46660
АТОГАЯА2
АТОГАЯА2 STEMMENTS 46660
STEMMENTS

Google

AGigOJa „ATOZA9A2 STEMMET2 46469
„ATOZA9A2
AGigOJa
STEMMET2
46469