Human Generated Data

Title

Untitled (Penney store employees in mens shoe department)

Date

1951

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2523

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 99.4
Person 99.1
Human 99.1
Person 99
Person 98.3
Person 98.1
Person 97.4
Person 97
Person 96.2
Restaurant 96.1
Person 95.7
Dining Table 93.9
Table 93.9
Room 91
Indoors 91
Meal 89
Food 89
Person 87
Interior Design 86.2
Couch 80.6
Cafeteria 77
Dining Room 76.2
Chair 73.7
Cafe 67.8
People 65
Female 64.7
Kid 63.6
Child 63.6
Shorts 60.2
Clothing 60.2
Apparel 60.2
Living Room 59.8
Dish 58.5
Classroom 57.6
School 57.6
Food Court 57.3
Floor 55.8
Diner 55.7
Girl 55.4
Person 49.4
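
The label/confidence pairs above are the shape of output returned by Amazon Rekognition's DetectLabels operation. As a hedged illustration, a minimal boto3 sketch follows; the file name, region, and thresholds are assumptions for the example, not values recorded with this object.

```python
import boto3

# Minimal sketch: send an image to Amazon Rekognition DetectLabels and print
# label/confidence pairs in the format used above. "photo.jpg" and the region
# are illustrative placeholders, not values from this record.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=45,  # the list above includes labels down to ~49%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```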

Clarifai
created on 2023-10-28

people 99.6
education 98
room 97.2
furniture 96.1
indoors 95.7
classroom 95.7
group 95.1
man 94.6
group together 94.5
school 94.4
desk 94
teacher 94
elementary school 94
child 93.5
monochrome 92.5
adult 92.4
woman 92.3
chair 88.9
family 87.1
table 83.6

Imagga
created on 2022-03-05

classroom 42.7
room 36.2
people 29
man 28.3
person 23.4
sport 23
business 21.2
male 20.6
adult 20.1
lifestyle 16.6
women 16.6
men 16.3
group 15.3
businessman 15
competition 14.6
fitness 14.4
teacher 13.8
exercise 13.6
portrait 13.6
gymnasium 13.2
athlete 12.8
chair 12.7
casual 12.7
board 12
corporate 12
human 12
center 11.8
tennis 11.7
suit 11.7
active 11.3
attractive 11.2
professional 11.1
blackboard 10.9
recreation 10.7
holding 10.7
table 10.6
sitting 10.3
worker 10.1
office 10
indoor 10
happy 10
businesswoman 10
athletic facility 9.8
court 9.7
interior 9.7
outdoors 9.7
class 9.6
education 9.5
work 9.4
smiling 9.4
action 9.3
modern 9.1
silhouette 9.1
team 9
cheerful 8.9
equipment 8.8
attire 8.8
healthy 8.8
indoors 8.8
couple 8.7
day 8.6
happiness 8.6
ball 8.4
hand 8.4
black 8.4
pretty 8.4
city 8.3
leisure 8.3
executive 8.3
fit 8.3
school 8.2
employee 8.1
net 8.1
success 8
building 7.9
trainer 7.9
boy 7.8
students 7.8
student 7.8
teaching 7.8
run 7.7
outside 7.7
facility 7.7
boss 7.6
outdoor 7.6
health 7.6
fun 7.5
training 7.4
handsome 7.1
smile 7.1
job 7.1
life 7
clothing 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.4
floor 96.5
table 95.9
furniture 90.8
person 85.9
footwear 65.3
chair 53.1
clothing 53
man 51.5
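
Microsoft's tags appear to come from Azure Computer Vision image analysis, which reports confidence on a 0-1 scale (so "text 98.4" would correspond to 0.984). A hedged REST sketch follows, assuming the v3.2 Analyze endpoint; the endpoint URL, key, and file name are placeholders:

```python
import requests

# Sketch: call the Azure Computer Vision v3.2 Analyze endpoint for tags and
# print them scaled to percentages, matching the list above. The endpoint,
# key, and file name are illustrative assumptions.
ENDPOINT = "https://example.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR-KEY"  # placeholder

with open("photo.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```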

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 96.8%
Happy 40.7%
Sad 38.6%
Calm 13.7%
Confused 4%
Disgusted 1.5%
Surprised 0.7%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Calm 97.9%
Sad 0.9%
Confused 0.8%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0%
Happy 0%

AWS Rekognition

Age 35-43
Gender Male, 97.6%
Calm 58.8%
Sad 21.9%
Confused 14.2%
Fear 1.8%
Disgusted 1.1%
Happy 1%
Surprised 0.8%
Angry 0.5%

AWS Rekognition

Age 39-47
Gender Male, 89.1%
Sad 54.9%
Happy 22.3%
Calm 11.3%
Confused 6.3%
Disgusted 1.9%
Angry 1.4%
Fear 1.2%
Surprised 0.6%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Happy 30.4%
Surprised 18.3%
Angry 17.8%
Calm 14.1%
Sad 9.4%
Fear 5.3%
Confused 2.7%
Disgusted 1.9%

AWS Rekognition

Age 23-33
Gender Male, 98.1%
Calm 35.4%
Sad 26.9%
Happy 13%
Confused 9.5%
Angry 5.1%
Surprised 4.3%
Disgusted 3.6%
Fear 2.2%

AWS Rekognition

Age 38-46
Gender Female, 89%
Calm 89.7%
Surprised 4.7%
Confused 2.9%
Sad 1.7%
Disgusted 0.4%
Happy 0.3%
Angry 0.2%
Fear 0.1%
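
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with confidence, and a distribution over eight emotions. A hedged boto3 sketch of how such per-face output can be produced (the file name and region are illustrative):

```python
import boto3

# Sketch: run Amazon Rekognition DetectFaces with full attributes and print
# age range, gender, and emotions per face, mirroring the blocks above.
client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; the record lists them highest-first.
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")
```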

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely
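
Unlike Rekognition, Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the blocks above read "Very unlikely" and so on. A hedged sketch with the google-cloud-vision Python client (file name assumed):

```python
from google.cloud import vision

# Sketch: Google Cloud Vision face detection returns likelihood enums per
# face, matching the "Very unlikely"-style values above. "photo.jpg" is a
# placeholder; credentials are taken from the environment.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```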

Feature analysis

Amazon

Person 99.1%
Person 99%
Person 98.3%
Person 98.1%
Person 97.4%
Person 97%
Person 96.2%
Person 95.7%
Person 87%
Person 49.4%
Chair 73.7%

Categories

Text analysis

Amazon

III
KUDAK-SALETA

Google

YT37A2-XA
YT37A2-XA
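
The strings above are OCR of the film's edge markings ("KUDAK-SALETA" is plausibly a misread of a Kodak safety-film imprint). A hedged sketch of the Amazon side using Rekognition's DetectText (file name and region assumed):

```python
import boto3

# Sketch: Amazon Rekognition DetectText OCRs text in the image, such as the
# edge markings transcribed above. "photo.jpg" and the region are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip per-word duplicates
        print(detection["DetectedText"])
```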