Human Generated Data

Title

Untitled (women in veiled costumes giving flower offerings to statue in church)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14543

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 98.7
Human 98.7
Person 98.1
Person 93.5
Indoors 92.7
Room 89.1
Clothing 81.3
Apparel 81.3
Plant 69.4
Person 67.8
Art 67.6
Flower 65.1
Blossom 65.1
People 60.6
Handrail 58.8
Banister 58.8
Floral Design 56.8
Graphics 56.8
Pattern 56.8
Wedding 56.2
Fashion 56.1
Gown 55.7
Flower Arrangement 55.4
Robe 55.2
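
These labels match the output shape of Amazon Rekognition's DetectLabels API. A minimal sketch with boto3, assuming a local copy of the photograph at the hypothetical path photo.jpg and an illustrative confidence floor:

```python
# Sketch only: file name and MinConfidence are illustrative assumptions,
# not values recorded anywhere in this catalog entry.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:           # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,                        # the lowest score in the list above is 55.2
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```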

Clarifai
created on 2023-10-29

people 100
group 99.3
group together 98.2
adult 97.7
woman 96.3
many 95.8
several 95.6
leader 95.1
man 94.9
administration 94.8
actress 94.1
home 93.7
child 93.2
step 93
wedding 91.7
art 91.6
street 88.5
music 87.8
wear 87.1
outfit 86.4
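
Clarifai's general model returns concepts with 0-1 confidence values, which listings like this render as percentages. A hedged sketch against Clarifai's public v2 REST API, with a placeholder API key, model ID, and image URL:

```python
# Sketch only: the key, model ID, and URL below are placeholders, not
# values from this record; consult Clarifai's docs for current details.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # assumption: a personal API key
MODEL_ID = "general-image-recognition"       # assumption: Clarifai's general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values in 0-1; the listing above shows percentages.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```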

Imagga
created on 2022-02-04

balcony 35.1
shop 34.1
architecture 33.9
building 28.3
old 27.9
barbershop 27.8
case 26
mercantile establishment 25.7
city 22.4
history 22.4
structure 21.7
religion 20.6
window 20.4
historic 19.2
ancient 19
house 18.6
altar 18.2
art 17.6
church 17.6
place of business 17.1
historical 16.9
facade 16.4
travel 14.8
culture 14.5
landmark 14.4
monument 14
antique 14
column 13.3
urban 13.1
famous 13
town 13
exterior 12.9
people 12.8
wall 12.8
home 12.8
cathedral 12.7
sculpture 12.6
tourism 12.4
stone 11.9
decoration 10.9
traditional 10.8
vintage 10.7
interior 10.6
religious 10.3
door 9.9
design 9.6
lamp 9.5
light 9.4
tradition 9.2
retro 9
palace 9
tower 9
style 8.9
statue 8.8
love 8.7
scene 8.7
boutique 8.6
establishment 8.5
street 8.3
inside 8.3
pattern 8.2
new 8.1
glass 7.9
temple 7.9
marble 7.9
pray 7.8
arch 7.8
god 7.7
frame 7.5
destination 7.5
bakery 7.3
black 7.2
indoors 7
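
Imagga exposes tagging through its v2 /tags endpoint, authenticated with an API key/secret pair over HTTP Basic auth. A sketch with placeholder credentials and a placeholder image URL:

```python
# Sketch only: credentials and image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```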

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

wedding dress 99
bride 97.3
text 95.1
dress 88
window 83
black 73.8
wedding 65.9
woman 65.6
clothing 65.5
old 60.5
flower 58.8
person 50
store 46.7
altar 26.2
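
These tags resemble the output of Azure Computer Vision's image-tagging endpoint. A hedged sketch against the v3.2 REST API, with a placeholder resource endpoint and subscription key:

```python
# Sketch only: the endpoint host, key, and image URL are placeholders
# for a hypothetical Azure Computer Vision resource.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidences come back in 0-1; the listing above shows percentages.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```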

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 66.2%
Calm 99.5%
Sad 0.2%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Happy 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Calm 69.7%
Sad 15%
Surprised 6.1%
Confused 5.1%
Happy 1.9%
Disgusted 1%
Angry 0.7%
Fear 0.3%

AWS Rekognition

Age 25-35
Gender Male, 50%
Calm 92.6%
Sad 5.4%
Happy 0.8%
Disgusted 0.3%
Fear 0.3%
Confused 0.3%
Angry 0.2%
Surprised 0.1%

AWS Rekognition

Age 19-27
Gender Female, 63.8%
Calm 29.9%
Happy 29%
Angry 17.8%
Sad 15.2%
Fear 3.8%
Surprised 2.3%
Confused 1.5%
Disgusted 0.6%
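
Each block above corresponds to one face returned by Amazon Rekognition's DetectFaces API when all facial attributes are requested. A minimal sketch with boto3, again assuming a hypothetical local file photo.jpg:

```python
# Sketch only: the file name is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],                      # request age range, gender, emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```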

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
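
Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why these rows read "Very unlikely". A sketch with the google-cloud-vision client, assuming application default credentials and a placeholder file name:

```python
# Sketch only: assumes google-cloud-vision is installed and application
# default credentials are configured; photo.jpg is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum, e.g. VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```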

Feature analysis

Amazon

Person
Person 98.7%
Person 98.1%
Person 93.5%
Person 67.8%
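
These per-person percentages repeat the instance-level detections that DetectLabels attaches to its "Person" label, each with its own bounding box. A sketch of reading them, with the same placeholder file name:

```python
# Sketch only: photo.jpg is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]    # fractions of image width/height
            print(f'Person {instance["Confidence"]:.1f}%  '
                  f'left={box["Left"]:.2f} top={box["Top"]:.2f}')
```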

Categories

Text analysis

Amazon

CLASS
1922
ETARTIN
ETARTIN ARGA
ARGA
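
Strings like these typically come from an OCR pass such as Amazon Rekognition's DetectText, which returns both line- and word-level detections. A minimal sketch with boto3 and a placeholder file name:

```python
# Sketch only: photo.jpg is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":          # skip word-level duplicates
        print(detection["DetectedText"])
```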