Human Generated Data

Title

Untitled (man, two women, and two children standing outside church between trees)

Date

1933

People

Artist: Curtis Studio, American, active 1891-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13014

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Clothing 100
Apparel 100
Person 99.4
Human 99.4
Dress 99.3
Person 98.5
Person 97.7
Person 97.6
Female 96.3
Robe 91.3
Fashion 91.3
Person 91
Gown 89.8
Woman 87.8
Shoe 84
Footwear 84
Suit 78.8
Overcoat 78.8
Coat 78.8
Wedding 77.2
People 76.8
Evening Dress 73.9
Wedding Gown 69.4
Bridegroom 64.9
Portrait 63.2
Photography 63.2
Face 63.2
Photo 63.2
Girl 60.7
Skirt 59.9
Meal 59.2
Food 59.2
Kid 58.6
Child 58.6
Home Decor 58.6
Long Sleeve 57.2
Sleeve 57.2
Plant 56.9
Dish 56.7
Bride 56.3
Door 55.3
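
The label-and-score pairs above are typical output of Amazon Rekognition's label detection. A minimal sketch of how tags like these are generally produced with boto3; the image path is a placeholder, since the photograph itself is not distributed with this record:

```python
import boto3

# Placeholder path; substitute a local copy of the image.
IMAGE_PATH = "4.2002.13014.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        # The list above bottoms out around 55, suggesting a threshold like this.
        MinConfidence=55,
    )

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```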

Clarifai
created on 2023-10-29

people 100
child 99.1
group 99
woman 98.5
adult 98.3
man 98.2
family 96.3
wear 95.1
three 94.2
leader 92.6
wedding 92.4
four 91.9
veil 91.9
education 91.5
gown (clothing) 91.3
group together 91.2
ceremony 90.9
two 90.7
son 90.4
outfit 88.7
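
The Clarifai concepts above come from a general image-recognition model. A hedged sketch against Clarifai's v2 REST API; the model ID, API key, and image URL are all assumptions, not values taken from this record:

```python
import requests

# Assumed values: substitute a real app-scoped API key and image URL.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed general model ID
IMAGE_URL = "https://example.org/4.2002.13014.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts arrive with 0-1 values; the page above shows them scaled to 0-100.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```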

Imagga
created on 2022-02-05

guillotine 20.4
old 19.5
building 18.8
statue 18.6
man 18.1
architecture 16.5
instrument of execution 16.3
people 15.1
instrument 14.8
stone 14.3
male 14.3
monument 14
waiter 13.9
column 13.9
adult 13.5
sculpture 13.5
musical instrument 13.3
city 13.3
tourism 13.2
culture 12.8
black 12.7
history 12.5
window 12.2
historic 11.9
art 11.8
groom 11.7
religion 11.6
couple 11.3
ancient 11.2
device 11
dress 10.8
bride 10.5
dining-room attendant 10.4
men 10.3
church 10.2
person 10
travel 9.9
antique 9.7
historical 9.4
worker 9.1
vintage 9.1
catholic 9
wind instrument 8.9
business 8.5
employee 8.5
two 8.5
traditional 8.3
fashion 8.3
wedding 8.3
human 8.2
family 8
home 8
women 7.9
indoors 7.9
happiness 7.8
ceremony 7.8
marble 7.7
bouquet 7.5
light 7.5
inside 7.4
landmark 7.2
portrait 7.1
love 7.1
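
Imagga exposes its tagger as a plain REST endpoint. A sketch of its /v2/tags call with the requests library; the key, secret, and image URL are placeholders:

```python
import requests

# Placeholder credentials and URL.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/4.2002.13014.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
)
response.raise_for_status()

# Tags come back with 0-100 confidence, matching the list above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```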

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 92.1
wedding dress 91.1
clothing 90.4
dress 85.8
person 84.8
black and white 83.1
bride 79.4
woman 70.9
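
The Microsoft tags resemble the output of Azure's Computer Vision tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; substitute your own Azure resource.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/4.2002.13014.jpg")

# Confidences come back in 0-1; the list above scales them to 0-100.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```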

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 66.8%
Happy 88.1%
Calm 9.8%
Surprised 0.8%
Fear 0.4%
Disgusted 0.3%
Sad 0.2%
Confused 0.2%
Angry 0.2%

AWS Rekognition

Age 31-41
Gender Female, 98.6%
Happy 96.7%
Calm 2.2%
Surprised 0.5%
Sad 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Confused 0%

AWS Rekognition

Age 47-53
Gender Male, 99.7%
Happy 91.3%
Calm 2.9%
Fear 1.6%
Surprised 1.5%
Sad 1.3%
Disgusted 0.8%
Confused 0.4%
Angry 0.3%

AWS Rekognition

Age 12-20
Gender Female, 71.4%
Sad 84.7%
Calm 10%
Happy 3.4%
Angry 0.6%
Disgusted 0.5%
Fear 0.3%
Confused 0.3%
Surprised 0.2%

AWS Rekognition

Age 30-40
Gender Female, 72.4%
Calm 89.1%
Sad 6.8%
Happy 2.5%
Confused 0.4%
Fear 0.4%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%
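
Each block above corresponds to one face found by Rekognition's DetectFaces, which reports an estimated age range, a gender guess with confidence, and a percentage across eight emotions per face. A minimal sketch; the image path is a placeholder:

```python
import boto3

client = boto3.client("rekognition")

with open("4.2002.13014.jpg", "rb") as f:  # placeholder path
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Sorted highest-first to match the layout of the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```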

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
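
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages, which is why these blocks look different from the Rekognition ones. A sketch with the google-cloud-vision client; the image path is a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.13014.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```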

Feature analysis

Amazon

Person 99.4%
Person 98.5%
Person 97.7%
Person 97.6%
Person 91%
Shoe 84%
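
Unlike the Tags section, Feature analysis lists one entry per detected instance: five people and one shoe. In Rekognition's response these arrive as the Instances array attached to a label, each with its own confidence and bounding box. A sketch, again with a placeholder path:

```python
import boto3

client = boto3.client("rekognition")

with open("4.2002.13014.jpg", "rb") as f:  # placeholder path
    response = client.detect_labels(Image={"Bytes": f.read()})

# Labels such as Person and Shoe carry one entry per detected instance;
# abstract labels (e.g. Wedding) have an empty Instances list.
for label in response["Labels"]:
    for instance in label["Instances"]:
        print(f"{label['Name']} {instance['Confidence']:.1f}%",
              instance["BoundingBox"])
```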

Categories

Text analysis

Amazon

SI

Google

NAOON-YT A2-MAMTrZA3
NAOON-YT
A2-MAMTrZA3
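
Both text results are raw OCR readings of marks in the photograph and are kept exactly as detected. Note that Google's first annotation is the full detected text and the remaining annotations are its individual tokens, which is why the combined string above is followed by its parts. A sketch of both calls; the image path is a placeholder:

```python
import boto3
from google.cloud import vision

with open("4.2002.13014.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

# Amazon: DetectText returns LINE and WORD detections.
rekognition = boto3.client("rekognition")
for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if det["Type"] == "LINE":
        print("Amazon:", det["DetectedText"])

# Google: the first annotation is the full text, the rest are single tokens.
vision_client = vision.ImageAnnotatorClient()
response = vision_client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print("Google:", annotation.description)
```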