Human Generated Data

Title

Untitled (man with two boys reading book)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17716

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clinic 99.3
Person 98.8
Human 98.8
Hospital 98.4
Person 97.7
Operating Theatre 96.2
Person 91.8
Indoors 70.3
Room 69.5
Doctor 58.4
Surgery 55.2

Imagga
created on 2022-02-26

man 44.4
person 43
senior 42.2
male 39
adult 35.3
people 35.2
grandma 35.1
indoors 31.7
elderly 30.7
couple 30.5
home 29.5
happy 29.5
mature 27.9
smiling 26.8
sitting 26.7
executive 26.4
together 25.4
businessman 24.7
office 24.4
teacher 23.6
professional 23.5
men 23.2
meeting 22.6
entrepreneur 22.2
retired 21.3
retirement 21.1
lifestyle 21
old 20.9
laptop 20.1
women 19
business 18.8
computer 18.4
table 18.4
husband 18.1
talking 18.1
team 17.9
group 17.7
60s 17.6
older 17.5
portrait 17.5
businesswoman 17.3
room 16.7
educator 16.2
teamwork 15.8
pensioner 15.6
cheerful 15.5
casual 15.3
businesspeople 15.2
patient 15.2
indoor 14.6
looking 14.4
worker 14.4
wife 14.2
colleagues 13.6
aged 13.6
smile 13.6
horizontal 13.4
work 13.4
happiness 13.3
working 13.3
corporate 12.9
specialist 12.8
desk 12.6
face 12.1
modern 11.9
explaining 11.8
70s 11.8
job 11.5
medical 11.5
hand 11.4
enjoying 11.4
clothing 11.2
technology 11.1
hospital 10.8
handsome 10.7
two people 10.7
to 10.6
grandfather 10.5
suit 9.9
planner 9.9
senior adult 9.9
nurse 9.8
coworkers 9.8
health 9.7
education 9.5
bed 9.5
color 9.5
doctor 9.4
camera 9.2
communication 9.2
successful 9.2
leisure 9.1
confident 9.1
family 8.9
gray hair 8.9
seventies 8.9
discussing 8.8
casual clothing 8.8
conference 8.8
discussion 8.8
love 8.7
illness 8.6
manager 8.4
inside 8.3
holding 8.3
director 8.2
collaboration 7.9
sixties 7.9
40s 7.8
coat 7.8
mid adult 7.7
leader 7.7
boss 7.7
drinking 7.7
age 7.6
drink 7.5
friendship 7.5
glasses 7.4
occupation 7.3
day 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.7
human face 80.8
person 78.6
clothing 73.1
hospital room 64.8
black and white 64.1
room 59.4

Face analysis

AWS Rekognition

Age 9-17
Gender Female, 98.7%
Happy 91.1%
Fear 2.7%
Calm 2%
Surprised 1.6%
Sad 1.2%
Angry 0.7%
Confused 0.3%
Disgusted 0.3%

AWS Rekognition

Age 6-14
Gender Male, 52.5%
Calm 70.4%
Sad 27.7%
Fear 0.5%
Disgusted 0.3%
Surprised 0.3%
Confused 0.3%
Angry 0.2%
Happy 0.2%

AWS Rekognition

Age 31-41
Gender Male, 86.5%
Calm 46.5%
Angry 29.5%
Sad 10.9%
Surprised 4.7%
Happy 3.1%
Disgusted 2.4%
Confused 1.9%
Fear 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a group of people sitting at a table 86.3%
a group of people sitting around a table 85.3%
a man and a woman sitting at a table 75.9%

Text analysis

Amazon

43
BI
Parx
Less Parx
Less
KODOK-S.VEELA

Google

43
Pare
43 La Pare
La