Human Generated Data

Title

Untitled (women working at tables, H. E. Harris Stamp Co.)

Date

July 7, 1954

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18108

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.3
Human 99.3
Person 99.1
Person 98.5
Building 98.5
Person 98
Person 97.5
Person 97.3
Person 97.3
Factory 97
Person 96.4
Person 94.6
Person 93.6
Assembly Line 84.6
Clothing 79.2
Apparel 79.2
Indoors 78.7
Room 78.2
Computer Keyboard 75.1
Electronics 75.1
Computer 75.1
Hardware 75.1
Keyboard 75.1
Computer Hardware 75.1
Workshop 74.8
Lab 70.8
Manufacturing 63
Housing 59.6
Classroom 58.6
School 58.6
Table 55.7
Furniture 55.7
Computer Keyboard 52.7
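
These labels have the shape of an AWS Rekognition DetectLabels response. A minimal sketch of the kind of call that produces them, using boto3; the filename, region, and confidence threshold are assumptions, not values from this record:

```python
import boto3

# Placeholder region; Rekognition is a real AWS service client in boto3.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed threshold; drops low-confidence labels
)

# Each label carries a name and a 0-100 confidence, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```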

Clarifai
created on 2023-10-22

people 99.6
desk 98.9
group 98.7
furniture 98
many 97.3
group together 97.2
adult 97.1
room 96.2
woman 95.3
education 94.9
man 93.9
employee 92.2
school 90.1
classroom 89.8
child 85.3
sit 83.1
administration 82
several 79.8
elementary school 79.4
concentration 78.9
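
Clarifai tags of this kind come from its general visual-recognition model. A hedged sketch against the Clarifai v2 REST API; the API key, model ID, and image URL are placeholders, and the exact endpoint shape varies across Clarifai API versions and account setups:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"          # placeholder credential
MODEL_ID = "general-image-recognition"     # assumed general model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Concept values are 0-1; scaling by 100 matches the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```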

Imagga
created on 2022-03-04

room 71.6
classroom 69.9
restaurant 45.1
table 37.8
office 35
man 31.1
interior 29.2
people 27.3
business 25.5
cafeteria 25.3
male 24.8
meeting 24.5
work 24.3
businessman 23.8
desk 22.6
person 22.4
indoors 22
chair 20.3
computer 20.1
team 19.7
working 19.4
happy 19.4
building 19.3
businesswoman 19.1
group 18.5
sitting 18
executive 17.8
teamwork 17.6
laptop 17.6
professional 17.1
adult 16.7
businesspeople 16.1
modern 16.1
job 15.9
worker 15.6
corporate 15.5
men 15.5
communication 15.1
counter 14.7
smiling 14.5
dinner 14
drink 13.4
glass 13.2
structure 13
service 13
indoor 12.8
conference 12.7
kitchen 12.7
suit 12.6
dining 12.4
talking 12.4
lifestyle 12.3
women 11.9
food 11.8
shop 11.5
together 11.4
education 11.3
party 11.2
coffee 11.1
discussion 10.7
colleagues 10.7
workplace 10.5
center 10.1
occupation 10.1
smile 10
success 9.7
meal 9.5
salon 9.5
floor 9.3
lunch 9.2
confident 9.1
portrait 9.1
technology 8.9
employee 8.9
home 8.8
cooperation 8.7
class 8.7
contemporary 8.5
design 8.4
teacher 8.4
eat 8.4
house 8.4
bar 8.3
holding 8.3
successful 8.2
light 8
looking 8
monitor 7.9
collaboration 7.9
couple 7.8
conversation 7.8
empty 7.7
waiter 7.7
collar 7.7
formal 7.6
hall 7.6
horizontal 7.5
learning 7.5
manager 7.4
presentation 7.4
inside 7.4
cheerful 7.3
black 7.2
seat 7.1
furniture 7.1
decor 7.1
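
Imagga returns comparable tags from its /v2/tags endpoint, authenticated with a key/secret pair over HTTP basic auth. A minimal sketch; the credentials and image URL are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Imagga confidences are already on a 0-100 scale, as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```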

Google
created on 2022-03-04

Table 94.3
Black 89.7
Black-and-white 86.3
Desk 85.1
Style 83.9
Chair 83.9
Window 83.5
Building 82.4
Monochrome photography 73.8
Monochrome 73.4
Coffee table 72.1
Office equipment 70.7
Art 70.2
Suit 70.2
Machine 69.8
Indoor games and sports 67
Customer 67
Room 66.6
Sitting 66.2
Street 65.3
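
Google's labels correspond to Cloud Vision's label-detection feature. A sketch using the google-cloud-vision client library; the filename is a placeholder, and credentials are assumed to come from the standard GOOGLE_APPLICATION_CREDENTIALS environment variable:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; scaling by 100 matches the percentages above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```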

Microsoft
created on 2022-03-04

person 97.2
indoor 96.6
text 96
computer 77.1
clothing 76.2
furniture 61
table 34.8
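
Microsoft's tags match Azure Computer Vision's Analyze Image operation with the Tags visual feature. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and filename are placeholders:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("photo.jpg", "rb") as f:  # placeholder filename
    data = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=data,
)
response.raise_for_status()

# Tag confidences are 0-1; scaling by 100 matches the values above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```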

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 97.8%
Calm 99.4%
Sad 0.4%
Confused 0%
Angry 0%
Surprised 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 95.1%
Calm 91.9%
Happy 3.8%
Surprised 1.4%
Confused 1.3%
Sad 0.6%
Disgusted 0.5%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 48-54
Gender Male, 98%
Calm 99.1%
Confused 0.3%
Happy 0.2%
Sad 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 96.6%
Calm 99.7%
Happy 0.1%
Sad 0.1%
Confused 0%
Surprised 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 98.8%
Calm 90%
Sad 8.4%
Happy 0.4%
Surprised 0.3%
Fear 0.3%
Disgusted 0.3%
Angry 0.2%
Confused 0.1%

AWS Rekognition

Age 35-43
Gender Male, 99.5%
Sad 82.3%
Calm 10.5%
Confused 4.5%
Happy 0.9%
Angry 0.7%
Disgusted 0.5%
Surprised 0.4%
Fear 0.2%
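
The per-face blocks above follow the shape of a Rekognition DetectFaces response with all facial attributes requested. A minimal sketch that prints the same age, gender, and emotion fields; the filename and region are placeholders:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come sorted here from most to least confident, as above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```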

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
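
The Google Vision face blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages, which matches Cloud Vision's face-detection annotations. A sketch; the filename is a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (e.g. VERY_UNLIKELY), which the
# listing above renders as "Very unlikely".
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```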

Feature analysis

Amazon

Person 99.3%
Person 99.1%
Person 98.5%
Person 98%
Person 97.5%
Person 97.3%
Person 97.3%
Person 96.4%
Person 94.6%
Person 93.6%
Computer Keyboard

Text analysis

Amazon

Je
YТ3°-

Google

YT37A°2-XA
YT37A°2-XA
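
Both text results are OCR output. A sketch of the two calls that produce them; the filename and region are placeholders. Google Vision's text_annotations list begins with the full detected text followed by per-word entries, which plausibly explains why the same string appears twice above:

```python
import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

# AWS Rekognition OCR: LINE-level detections correspond to the Amazon rows.
rekognition = boto3.client("rekognition", region_name="us-east-1")
for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"])

# Google Cloud Vision OCR on the same bytes; the first annotation is the
# full text block, so short strings can appear twice when dumped naively.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)
```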