Human Generated Data

Title

Untitled (people at an airport counter)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7044

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.5
Person 99.5
Person 99.2
Person 99.2
Person 99.2
Person 98.9
Person 98.1
Person 98.1
Person 90.2
Person 89.7
Bag 87
Person 83.4
Person 75.3
Person 74.7
Luggage 69.3

Imagga
created on 2021-12-15

businessman 36.2
business 35.8
people 35.7
man 32.2
person 30.4
team 30.4
male 28.4
group 28.2
work 27.5
brass 26.7
wind instrument 26.2
teamwork 25
meeting 24.5
office 23.4
musical instrument 23.1
businesswoman 22.7
men 20.6
job 18.6
silhouette 18.2
corporate 18
communication 17.6
professional 17.3
adult 17.2
table 15.7
teacher 15.5
worker 15.4
executive 15.4
cornet 14.9
presentation 14.9
businesspeople 14.2
working 14.1
room 14.1
computer 13.8
sitting 13.7
boss 13.4
laptop 12.9
women 12.6
leader 12.5
couple 12.2
board 11.8
conference 11.7
happy 11.3
education 11.3
manager 11.2
seminar 10.8
colleagues 10.7
chair 10.7
occupation 10.1
employee 10
indoor 10
confident 10
modern 9.8
human 9.7
desk 9.7
indoors 9.7
success 9.6
together 9.6
crowd 9.6
career 9.5
finance 9.3
bright 9.3
successful 9.1
design 9
speech 8.8
audience 8.8
partner 8.7
smiling 8.7
partnership 8.6
student 8.6
nation 8.5
technology 8.2
suit 8.1
outfit 8.1
symbol 8.1
photographer 8
idea 8
lecture 7.9
president 7.8
happiness 7.8
discussion 7.8
clothing 7.8
corporation 7.7
class 7.7
patriotic 7.7
classroom 7.6
flag 7.3
sexy 7.2
looking 7.2
smile 7.1
day 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

person 95.9
text 92.7
clothing 85.8
man 67.9
black and white 64.5
group 62
old 41.4

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Male, 64.3%
Calm 97.4%
Happy 1.6%
Sad 0.6%
Angry 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 10-20
Gender Female, 68.8%
Happy 58.6%
Calm 31%
Sad 8.8%
Confused 0.7%
Angry 0.5%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 26-40
Gender Female, 57%
Happy 55.6%
Calm 25.8%
Sad 13.9%
Confused 3.2%
Angry 0.6%
Surprised 0.5%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 12-22
Gender Male, 84.7%
Calm 77.6%
Happy 13.9%
Sad 6.2%
Confused 1.4%
Angry 0.4%
Surprised 0.3%
Disgusted 0.1%
Fear 0.1%

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people standing in a room 89.3%
a group of people posing for a photo 71.5%
a group of people in a room 71.4%

Text analysis

Amazon

42
JSJ
KAGOY
YE3RAS KAGOY
YE3RAS

Google

42 YT37A2
42
YT37A2