Human Generated Data

Title

Untitled (Penney store employees in mens shoe department)

Date

1951

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2524

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.3
Person 99.3
Person 99.2
Furniture 99
Indoors 98.9
Interior Design 98.9
Person 98.7
Person 98.5
Person 98.1
Person 97.2
Room 97.2
Person 97
Person 96.3
Clothing 96.2
Footwear 96.2
Apparel 96.2
Shoe 96.2
Person 91.6
Classroom 76
School 76
Clinic 73.8
Chair 67.9
Bedroom 64.2
People 64.2
Person 62.8
Living Room 60.9
Chair 60.4
Shoe 60.4
Workshop 57.4
Chair 51.5
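
The Amazon list above pairs each predicted label with a confidence score, and labels repeat once per detected instance (e.g. one "Person" entry per figure in the photograph). As an illustration only, not part of the museum's actual pipeline, such pairs could be deduplicated and thresholded like this (the `labels` list below is a transcription of a subset of the scores above):

```python
# Illustrative sketch: deduplicate repeated Rekognition-style labels
# (keeping the highest score per label) and filter by a confidence floor.

# (label, confidence) pairs transcribed from the Amazon tags above
labels = [
    ("Human", 99.3), ("Person", 99.3), ("Person", 99.2), ("Furniture", 99.0),
    ("Indoors", 98.9), ("Interior Design", 98.9), ("Room", 97.2),
    ("Clothing", 96.2), ("Footwear", 96.2), ("Shoe", 96.2),
    ("Classroom", 76.0), ("Chair", 67.9), ("Workshop", 57.4), ("Chair", 51.5),
]

def top_labels(pairs, threshold=90.0):
    """Return unique labels at or above `threshold`, highest score first."""
    best = {}
    for name, score in pairs:
        if score >= threshold and score > best.get(name, 0.0):
            best[name] = score
    return sorted(best.items(), key=lambda kv: -kv[1])

print(top_labels(labels))
```

With the default 90.0 floor, the low-confidence scene guesses ("Classroom", "Chair", "Workshop") drop out and only the high-confidence labels remain.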

Imagga
created on 2022-03-05

classroom 43
brass 39.9
room 36.6
wind instrument 35.3
businessman 33.5
people 33.5
male 31.9
business 31.6
person 30.2
men 28.3
man 28.2
group 27.4
musical instrument 26
adult 23.4
meeting 22.6
professional 21.8
corporate 21.5
businesswoman 20
women 19.8
team 19.7
happy 19.4
executive 19.3
cornet 19.3
office 19.3
teacher 17.4
job 16.8
teamwork 16.7
smiling 15.9
suit 15.3
worker 15.3
work 14.9
table 14.7
confident 14.5
communication 14.3
silhouette 14.1
employee 13.7
businesspeople 13.3
couple 13.1
boy 13
laptop 12.7
black 12.6
modern 12.6
indoors 12.3
together 12.3
board 11.8
lifestyle 11.6
chair 11.4
education 11.3
manager 11.2
sitting 11.2
portrait 11
handsome 10.7
interior 10.6
boss 10.5
success 10.5
successful 10.1
indoor 10
diversity 9.6
desk 9.4
presentation 9.3
life 9.2
device 9.2
girls 9.1
attractive 9.1
holding 9.1
cheerful 8.9
conference 8.8
colleagues 8.7
standing 8.7
class 8.7
happiness 8.6
talking 8.6
smile 8.5
casual 8.5
finance 8.4
building 8.4
study 8.4
fashion 8.3
hall 8.1
computer 8
working 7.9
hands 7.8
school 7.7
youth 7.7
blackboard 7.6
friendship 7.5
fun 7.5
student 7.4
copy space 7.2
looking 7.2
home 7.2
family 7.1

Google
created on 2022-03-05

Furniture 93.5
Black 89.9
Table 88.2
Black-and-white 86.8
Window 86.2
Chair 85.3
Style 84
Monochrome 77.5
Monochrome photography 76.5
Snapshot 74.3
Event 69.7
Vintage clothing 69.6
Room 67.5
Class 63.4
Stock photography 63
Sitting 62.3
Child 60.5
Suit 55.4
Shorts 55.1
School 54.3

Microsoft
created on 2022-03-05

person 94.4
table 90
text 88.5
clothing 84.2
furniture 81.7
footwear 65
chair 62.7
man 62.4
wheelchair 60.4
tennis 53.4

Face analysis


AWS Rekognition

Age 31-41
Gender Male, 99.9%
Sad 92.5%
Calm 3.2%
Disgusted 1.5%
Confused 0.9%
Happy 0.7%
Angry 0.6%
Surprised 0.5%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Male, 73.9%
Happy 96%
Calm 2.8%
Sad 0.4%
Confused 0.2%
Angry 0.2%
Fear 0.2%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 29-39
Gender Male, 64.3%
Calm 44.5%
Happy 39.8%
Sad 5.2%
Surprised 2.9%
Disgusted 2.5%
Fear 2.5%
Confused 1.4%
Angry 1.3%

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Calm 79.9%
Confused 10.2%
Angry 4.7%
Sad 1.6%
Disgusted 1.3%
Surprised 1.1%
Fear 0.7%
Happy 0.5%

AWS Rekognition

Age 51-59
Gender Male, 98%
Happy 66.7%
Calm 30.6%
Sad 1%
Confused 0.5%
Disgusted 0.4%
Angry 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Sad 70.2%
Happy 14.6%
Calm 8.3%
Angry 3.2%
Confused 1.2%
Disgusted 1%
Fear 0.9%
Surprised 0.6%

AWS Rekognition

Age 36-44
Gender Male, 100%
Calm 54.1%
Sad 14.7%
Happy 12.6%
Angry 8.9%
Surprised 4.5%
Disgusted 2.5%
Confused 1.7%
Fear 1%

AWS Rekognition

Age 45-51
Gender Male, 72%
Calm 76.2%
Confused 10%
Surprised 4.2%
Sad 4.2%
Angry 2.1%
Disgusted 1.6%
Happy 1%
Fear 0.7%
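
Each AWS Rekognition face entry above is a probability distribution over eight emotions, summing to roughly 100%. As an illustration only (not the museum's pipeline), the dominant emotion for a face can be read off by taking the highest-scoring entry; the dictionary below transcribes the first face listed above:

```python
# Illustrative only: pick the dominant emotion from one of the
# AWS Rekognition distributions above (the first face, Age 31-41).
face = {
    "Sad": 92.5, "Calm": 3.2, "Disgusted": 1.5, "Confused": 0.9,
    "Happy": 0.7, "Angry": 0.6, "Surprised": 0.5, "Fear": 0.2,
}

dominant = max(face, key=face.get)
print(dominant)  # the emotion with the highest score
```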

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Shoe 96.2%
Chair 67.9%

Captions

Microsoft

a group of people standing in front of a building 81.1%
a group of people standing in a room 81%
a group of people in a room 80.9%

Text analysis

Amazon

XXX
KUDAK-SALETA

Google

YTヨコA2- A
ヨコ
A
A2-
YT