Human Generated Data

Title

Untitled (men and women in newsroom)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16659

Machine Generated Data

Tags

Amazon
created on 2022-02-18

Clinic 99.3
Person 98.3
Human 98.3
Hospital 97.3
Person 97.2
Person 96.1
Operating Theatre 95.6
Person 95.5
Person 95
Person 94.4
Person 93.6
Person 76.5
Indoors 63.7
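
The label/confidence pairs above are the shape of output returned by AWS Rekognition's DetectLabels API. A minimal sketch of how such tags can be regenerated with boto3 follows; the file name and region are assumptions, not part of the original record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph
with open("newsroom.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of labels returned
    MinConfidence=60.0,  # drop low-confidence labels, as in the list above
)

for label in response["Labels"]:
    # Prints e.g. "Clinic 99.3", matching the tag/score format above
    print(f"{label['Name']} {label['Confidence']:.1f}")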

Clarifai
created on 2023-10-29

people 99.2
group 97.7
child 97.5
indoors 97.4
education 97.3
woman 96.7
adult 96.3
room 95.7
man 95.5
classroom 94
sit 93.9
group together 93.3
monochrome 93.1
school 93
desk 92.9
furniture 91.4
chair 90
teacher 89.5
boy 86.4
elementary school 84.6

Imagga
created on 2022-02-18

salon 46.1
room 38.4
man 31.6
people 29.6
person 27.7
interior 25.6
male 25.5
office 24.6
clinic 23.6
indoors 22
table 21.4
professional 21.1
business 20.6
adult 20.2
shop 20.1
happy 20
home 19.9
modern 19.6
work 19.6
businessman 19.4
smiling 18.8
team 17.9
worker 17.8
working 17.7
computer 17.6
indoor 17.3
men 17.2
group 16.9
restaurant 16.6
classroom 16.3
teamwork 15.8
meeting 15.1
businesswoman 14.5
chair 14.4
businesspeople 14.2
medical 14.1
desk 13.9
corporate 13.7
lifestyle 13.7
laptop 13.7
barbershop 13.6
together 13.1
furniture 13.1
sitting 12.9
women 12.6
mercantile establishment 12.5
portrait 12.3
decor 11.5
couple 11.3
doctor 11.3
executive 11.2
kitchen 11.1
20s 11
counter 10.9
house 10.9
colleagues 10.7
talking 10.5
service 10.3
inside 10.1
smile 10
holding 9.9
conference 9.8
job 9.7
design 9.6
education 9.5
decoration 9.5
patient 9.5
mature 9.3
communication 9.2
drink 9.2
occupation 9.2
hospital 9
cheerful 8.9
technology 8.9
style 8.9
light 8.7
laboratory 8.7
test 8.7
luxury 8.6
nurse 8.5
manager 8.4
place of business 8.3
successful 8.2
employee 8.1
new 8.1
success 8
hall 8
associates 7.9
glass 7.8
lab 7.8
dinner 7.7
two 7.6
dining 7.6
clothing 7.6
enjoying 7.6
horizontal 7.5
contemporary 7.5
human 7.5
presentation 7.4
floor 7.4
board 7.4
food 7.4
equipment 7.3
color 7.2
suit 7.2
teacher 7.2
idea 7.1
coat 7.1
medicine 7
architecture 7

Google
created on 2022-02-18

(no tags recorded)

Microsoft
created on 2022-02-18

table 96.1
person 95.8
indoor 91
text 90.7
clothing 73.7
woman 73.3
man 58.6
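
The Microsoft tags above match the output of Azure's Computer Vision tagging endpoint. A minimal sketch with the azure-cognitiveservices-vision-computervision client follows; the endpoint, key, and file name are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and subscription key
client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# Hypothetical local copy of the photograph
with open("newsroom.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # Confidence is reported on a 0-1 scale; scale to match "table 96.1" above
    print(f"{tag.name} {tag.confidence * 100:.1f}")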

Color analysis

(color swatches not reproduced in this text record)

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 79.4%
Calm 96.1%
Confused 1%
Sad 1%
Happy 0.9%
Disgusted 0.4%
Angry 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 97%
Calm 85.2%
Happy 6.5%
Confused 3.4%
Disgusted 2.6%
Sad 0.9%
Angry 0.8%
Surprised 0.3%
Fear 0.3%

AWS Rekognition

Age 25-35
Gender Female, 89.4%
Sad 81.6%
Fear 6.3%
Happy 5%
Calm 3.6%
Confused 1.3%
Angry 0.8%
Surprised 0.8%
Disgusted 0.7%

AWS Rekognition

Age 45-53
Gender Female, 97.8%
Sad 54%
Happy 31.2%
Disgusted 4.8%
Confused 2.7%
Calm 2.3%
Angry 2%
Surprised 1.7%
Fear 1.3%

AWS Rekognition

Age 23-31
Gender Male, 95.9%
Calm 40.7%
Fear 28.9%
Sad 14.3%
Angry 5.1%
Confused 4.1%
Disgusted 2.5%
Happy 2.3%
Surprised 2%

AWS Rekognition

Age 23-33
Gender Male, 56.5%
Sad 47%
Calm 38.9%
Confused 5.8%
Disgusted 2.2%
Happy 1.9%
Angry 1.7%
Fear 1.4%
Surprised 1.2%

AWS Rekognition

Age 45-53
Gender Female, 65.3%
Sad 97.4%
Calm 1.6%
Happy 0.4%
Surprised 0.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Female, 69.4%
Sad 52%
Calm 35.7%
Confused 9.1%
Angry 1.3%
Happy 0.8%
Disgusted 0.6%
Surprised 0.3%
Fear 0.3%
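
The per-face age ranges, gender estimates, and emotion scores above are the kind of output returned by Rekognition's DetectFaces call when full attributes are requested. A minimal sketch (file name and region are assumptions):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph
with open("newsroom.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 40, "High": 48}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 79.4}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort by confidence to match the lists above
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")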

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
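
The Google Vision rows above correspond to the likelihood fields on the FaceAnnotation objects returned by Cloud Vision face detection. A minimal sketch with the google-cloud-vision client follows; the file name is an assumption, and credentials are taken from the environment.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the photograph
with open("newsroom.jpg", "rb") as f:
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each field is a Likelihood enum whose name matches the strings above
    # (VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)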

Feature analysis

Amazon

Person
Person 98.3%
Person 97.2%
Person 96.1%
Person 95.5%
Person 95%
Person 94.4%
Person 93.6%
Person 76.5%

Categories

Imagga

interior objects 99.2%

Text analysis

Amazon

2
sue
KODOKEVEELA
PART
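
The fragments above ("2", "sue", "KODOKEVEELA", "PART") are raw detections of the kind returned by Rekognition's DetectText; on a period photograph, OCR output like this is often partial or garbled. A minimal sketch (file name and region are assumptions):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph
with open("newsroom.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE entries aggregate WORD entries; print lines only, as in the list above
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])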