Human Generated Data

Title

Untitled (two young women praying in church)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17947

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Clothing 99.8
Apparel 99.8
Person 98.4
Human 98.4
Person 97.9
Veil 97.7
Female 81.3
Tie 75.8
Accessories 75.8
Accessory 75.8
Woman 68.8
Plant 67
Suit 63.4
Coat 63.4
Overcoat 63.4
Portrait 62.7
Face 62.7
Photography 62.7
Photo 62.7
Meal 61.8
Food 61.8
Indoors 58.8
Room 57.6
Dish 55.4
Gown 55
Fashion 55
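
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. The sketch below, with a placeholder file name and region, shows how such scores could be retrieved with boto3; it illustrates the API, not the pipeline that actually produced this record.

```python
# Minimal sketch (assumed setup): fetching label/confidence pairs like the
# ones listed above with Amazon Rekognition's DetectLabels via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # roughly the lowest score shown in the list above
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```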

Clarifai
created on 2023-10-29

monochrome 99.1
people 99.1
woman 97.8
wedding 97.6
indoors 97.6
adult 97.3
man 95.9
two 92.7
bride 89.8
wear 89.5
portrait 86.4
veil 86
mirror 85.3
child 85.3
three 84.3
dressing room 83.7
groom 83
group 82.2
sit 80.6
one 79.7

Imagga
created on 2022-03-04

person 33.8
man 31
office 30.6
business 29.7
people 29.5
computer 28.9
laptop 27.5
television 25
adult 24.8
teacher 24.8
professional 24.6
male 24.1
happy 23.8
businessman 22.9
work 22.7
working 21.2
telecommunication system 19.8
home 19.1
sitting 18.9
smiling 18.1
technology 17.8
job 17.7
indoors 16.7
education 16.4
desk 16
student 15.6
portrait 15.5
room 15.2
classroom 14.9
businesswoman 14.5
smile 14.2
case 14.2
educator 14.1
corporate 13.7
group 13.7
monitor 13.5
success 12.9
looking 12.8
indoor 12.8
screen 12.6
modern 12.6
team 12.5
worker 12.4
meeting 12.2
patient 11.9
communication 11.7
suit 11.7
lifestyle 11.6
senior 11.2
executive 11.1
school 11
finance 11
blackboard 10.9
board 10.8
class 10.6
attractive 10.5
career 10.4
notebook 10.3
study 10.2
manager 10.2
teamwork 10.2
horizontal 10
interior 9.7
men 9.4
occupation 9.2
holding 9.1
old 9.1
human 9
cheerful 8.9
happiness 8.6
college 8.5
writing 8.5
equipment 8.4
hand 8.4
back 8.3
successful 8.2
one 8.2
confident 8.2
building 8.1
women 7.9
face 7.8
table 7.8
university 7.8
space 7.8
pretty 7.7
studying 7.7
casual 7.6
coat 7.5
doctor 7.5
director 7.5
sick person 7.4
lady 7.3
world 7.3
film 7.2
color 7.2
handsome 7.1
science 7.1
to 7.1
medical 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 99.6
wedding dress 91.5
black and white 88.4
bride 82.1
woman 78.3
clothing 67.4
human face 66.9
dress 62.5
person 56.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Female, 59.3%
Calm 98.8%
Sad 0.8%
Happy 0.2%
Surprised 0.1%
Confused 0.1%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 30-40
Gender Female, 50.3%
Calm 99.2%
Sad 0.6%
Confused 0.1%
Happy 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 18-24
Gender Female, 95.3%
Sad 85.8%
Fear 9.7%
Calm 1.2%
Happy 0.9%
Disgusted 0.8%
Angry 0.8%
Confused 0.5%
Surprised 0.3%
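
Age ranges, gender estimates, and emotion scores like those above correspond to Rekognition's DetectFaces operation with all facial attributes requested. The following is a minimal sketch under the same placeholder setup as the earlier boto3 example; it shows where each field comes from, not the project's actual pipeline.

```python
# Minimal sketch (assumed setup): DetectFaces with Attributes=["ALL"] returns
# the age range, gender, and per-emotion confidences shown in this section.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```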

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
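
The likelihood ratings above match the fields of a Google Cloud Vision face annotation. Below is a minimal sketch using the google-cloud-vision client, with placeholder credentials and file name, showing which field each rating corresponds to; it is an assumed illustration, not the pipeline that produced this record.

```python
# Minimal sketch (assumed setup): Google Cloud Vision face detection returns
# likelihood enums for surprise, anger, sorrow, joy, headwear, and blur.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # requires GOOGLE_APPLICATION_CREDENTIALS

with open("photograph.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```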

Feature analysis

Amazon

Person
Tie
Person 98.4%
Person 97.9%
Tie 75.8%

Categories

Text analysis

Google

MJI7-- Y T 3RA°2-- AGO
MJI7--
Y
T
3RA°2--
AGO
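
Output of this shape is typical of Google Cloud Vision text detection (OCR): a first annotation containing the full detected string, followed by one annotation per token, which is the pattern shown above. A minimal sketch under assumed setup:

```python
# Minimal sketch (assumed setup): Google Cloud Vision text detection (OCR)
# yields one annotation with the full string, then one per detected token.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # requires GOOGLE_APPLICATION_CREDENTIALS

with open("photograph.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```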