Human Generated Data

Title

Untitled (man and woman on chairs, many people standing around)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16710

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.8
Human 99.8
Person 99.5
Person 99.1
Person 99
Person 99
Person 98.3
Person 97.8
Clinic 97.4
Person 96.6
Person 96.6
Person 95.7
Hospital 85.3
Person 83
Chair 81.8
Furniture 81.8
Indoors 79.3
Chair 75.4
Operating Theatre 75.2
Person 75
Room 74.6
Person 62.3
People 58.9
Interior Design 58.7
Person 53
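
The Amazon listing above pairs each label with a confidence score. A minimal sketch of turning such lines into structured data, assuming the trailing number on each line is the confidence (an inference from this listing's layout, not a documented schema):

```python
# Hedged sketch: parse confidence-scored label lines like the
# Amazon Rekognition tags above into (label, score) pairs.
# Assumes each line ends with a numeric confidence value.

def parse_tags(lines):
    """Split each 'Label 99.8' line into (label, float score)."""
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines between entries
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# Sample values taken from the listing above.
sample = ["Person 99.8", "Clinic 97.4", "Operating Theatre 75.2"]
print(parse_tags(sample))
# → [('Person', 99.8), ('Clinic', 97.4), ('Operating Theatre', 75.2)]
```

Note that `rpartition` splits on the last space, so multi-word labels such as "Operating Theatre" survive intact.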

Clarifai
created on 2023-10-29

people 99.9
group 99.5
man 98
education 97.8
woman 96.8
school 96.8
adult 96.5
child 96.2
teacher 95.6
room 94.4
indoors 94.1
group together 92.3
many 86.4
leader 86.4
administration 85.7
elementary school 85.4
class 84.9
classroom 82.5
family 82.4
boy 81.9

Imagga
created on 2022-02-26

teacher 60.1
educator 42.2
person 41.9
professional 40.9
adult 40.5
man 39
people 33.5
male 33.3
room 31.2
office 27.1
business 22.5
businessman 22.1
indoors 22
smiling 21.7
home 21.5
classroom 21.5
happy 21.3
looking 20.8
work 20.4
computer 18.5
patient 17.8
laptop 17.3
education 16.5
sitting 16.3
group 16.1
casual 16.1
portrait 15.5
modern 14.7
indoor 14.6
corporate 14.6
men 14.6
team 14.3
life 14.1
working 14.1
nurse 14.1
lifestyle 13.7
women 13.4
job 13.3
desk 13.2
senior 13.1
mature 13
blackboard 12.9
student 12.1
technology 11.9
holding 11.6
interior 11.5
medical 11.5
meeting 11.3
standing 11.3
doctor 11.3
phone 11.1
communication 10.9
clothing 10.9
businesswoman 10.9
executive 10.9
clinic 10.9
board 10.9
school 10.8
smile 10.7
cheerful 10.6
coat 10.5
couple 10.5
health 10.4
table 10.4
manager 10.2
teamwork 10.2
occupation 10.1
horizontal 10
handsome 9.8
case 9.8
lab coat 9.7
exam 9.6
businesspeople 9.5
color 9.5
happiness 9.4
camera 9.2
face 9.2
house 9.2
20s 9.2
worker 9
medicine 8.8
hospital 8.7
elderly 8.6
smart 8.5
two 8.5
building 8.4
together 7.9
bright 7.9
day 7.8
40s 7.8
two people 7.8
space 7.8
talking 7.6
human 7.5
care 7.4
glasses 7.4
aged 7.2
copy space 7.2
suit 7.2
newspaper 7.2
shop 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 94.4
clothing 92.5
person 90.7
man 71.5
drawing 66.7
woman 52

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Female, 95.2%
Sad 45.5%
Happy 19.7%
Calm 13.9%
Confused 8.9%
Surprised 3.6%
Angry 3.2%
Fear 2.8%
Disgusted 2.4%

AWS Rekognition

Age 27-37
Gender Female, 69%
Fear 37.9%
Happy 29.4%
Calm 16.6%
Sad 6.1%
Confused 4%
Surprised 2.8%
Angry 2%
Disgusted 1.2%

AWS Rekognition

Age 28-38
Gender Female, 97.2%
Happy 85.6%
Calm 7.6%
Sad 4.6%
Confused 0.9%
Surprised 0.4%
Fear 0.3%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 31-41
Gender Female, 81%
Happy 57.8%
Sad 21.7%
Confused 12.2%
Calm 3.1%
Disgusted 1.7%
Angry 1.3%
Fear 1.3%
Surprised 0.9%

AWS Rekognition

Age 27-37
Gender Male, 98.2%
Sad 65.1%
Confused 15.3%
Calm 15.1%
Happy 1.8%
Fear 1%
Disgusted 0.7%
Angry 0.5%
Surprised 0.5%

AWS Rekognition

Age 19-27
Gender Male, 81.6%
Calm 97%
Happy 1%
Sad 0.7%
Fear 0.4%
Angry 0.4%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 27-37
Gender Female, 94.8%
Sad 52.2%
Confused 28.7%
Calm 16.2%
Happy 1.6%
Fear 0.4%
Angry 0.4%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 26-36
Gender Male, 69.8%
Calm 83.5%
Sad 7.8%
Happy 4.1%
Confused 1.6%
Fear 1.1%
Angry 0.9%
Disgusted 0.7%
Surprised 0.3%

AWS Rekognition

Age 23-33
Gender Female, 73%
Calm 56.4%
Happy 17.2%
Sad 16.5%
Angry 3.1%
Confused 2.9%
Disgusted 1.8%
Fear 1%
Surprised 1%
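
Each AWS Rekognition block above gives a full emotion distribution per detected face. A minimal sketch of reducing such a block to its dominant emotion; the sample values are copied from the first face listed, and treating the highest-scoring emotion as "dominant" is an assumption, not part of the source record:

```python
# Hedged sketch: pick the highest-confidence emotion from a
# face-analysis block like the AWS Rekognition entries above.

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

# Scores from the first AWS Rekognition face above (Age 23-33).
first_face = {
    "Sad": 45.5, "Happy": 19.7, "Calm": 13.9, "Confused": 8.9,
    "Surprised": 3.6, "Angry": 3.2, "Fear": 2.8, "Disgusted": 2.4,
}
print(dominant_emotion(first_face))
# → ('Sad', 45.5)
```

For faces with a flat distribution (e.g. the second face, where Fear leads at only 37.9%), the "dominant" label is a weak signal and worth reporting alongside its score.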

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Person 99.8%
Person 99.5%
Person 99.1%
Person 99%
Person 99%
Person 98.3%
Person 97.8%
Person 96.6%
Person 96.6%
Person 95.7%
Person 83%
Person 75%
Person 62.3%
Person 53%
Chair 81.8%
Chair 75.4%

Categories

Imagga

text visuals 87.7%
interior objects 9.9%

Text analysis

Amazon

19
KODAKA-ITW

Google

MJI7-- YT37A°2 - -XAGON
MJI7--
YT37A°2
-
-XAGON