Human Generated Data

Title

Untitled (bride and bridesmaids standing outside house)

Date

c. 1950

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18906

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.4
Human 98.4
Person 97.6
Clothing 97.5
Apparel 97.5
Person 95.6
Person 93.1
Person 90.4
Person 90.3
Person 79.8
People 74.2
Female 66.3
Face 65.4
Crowd 62.8
Photography 61.9
Photo 61.9
Hat 59
Outdoors 58.5
Wedding 58
Girl 57.3
Robe 55.2
Fashion 55.2
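
These label/confidence pairs have the shape of Amazon Rekognition's detect_labels output. A minimal sketch of a call that would produce such a list, assuming configured AWS credentials and a hypothetical local file scan.jpg:

import boto3

# Assumes AWS credentials are configured; "scan.jpg" is a hypothetical filename.
rekognition = boto3.client("rekognition")

with open("scan.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # mirrors the apparent cutoff above (lowest score is 55.2)
    )

# Each label carries a name and a 0-100 confidence score,
# e.g. ("Person", 98.4), ("Wedding", 58.0).
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')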

Clarifai
created on 2023-10-22

people 99.9
group 99.1
wedding 98.9
adult 98.2
man 98.2
group together 98.1
bride 97.7
woman 97
veil 96.7
groom 95.2
wear 94.2
music 93.9
ceremony 92.9
musician 88.9
actor 88.9
many 88.2
street 87.9
dress 85.5
actress 84.9
leader 84.1
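
Clarifai exposes comparable concept predictions through its v2 REST predict endpoint. A sketch under that assumption; the API key, image URL, and model name are placeholders, and the exact auth scheme may differ by account setup:

import requests

# Hypothetical credentials and image URL; the model name assumes
# Clarifai's general image-recognition model.
headers = {"Authorization": "Key YOUR_API_KEY"}
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/scan.jpg"}}}]}

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers=headers,
    json=payload,
)
resp.raise_for_status()

# Clarifai reports concept values on a 0-1 scale; the listing above
# appears to rescale them to 0-100.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')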

Imagga
created on 2022-03-05

cemetery 85.4
graffito 30.5
decoration 20.8
architecture 20.3
city 20
old 18.8
building 17.7
stone 14.8
house 14.2
landscape 14.1
structure 14
industrial 13.6
sky 13.4
travel 13.4
dirty 12.7
urban 12.2
wall 12.2
black 12
construction 12
scenery 11.7
history 11.6
scene 11.3
street 11
vintage 10.8
water 10.7
industry 10.2
grunge 10.2
dark 10
outdoor 9.9
cold 9.5
fence 9.4
winter 9.4
memorial 9.2
gravestone 9.1
rural 8.8
factory 8.7
ancient 8.6
tree 8.5
tunnel 8.3
exterior 8.3
road 8.1
country 7.9
sea 7.8
texture 7.6
tourism 7.4
brown 7.4
light 7.4
earth 7.3
business 7.3
transportation 7.2
trees 7.1
scenic 7
season 7
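
Imagga's tags come from its v2 tagging endpoint, which takes an image URL and HTTP basic auth with an API key/secret pair. A sketch with placeholder credentials:

import requests

# Hypothetical credentials and image URL.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/scan.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

# Tags arrive as {"confidence": <0-100>, "tag": {"en": "<label>"}}.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')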

Google
created on 2022-03-05

(none recorded)

Microsoft
created on 2022-03-05

outdoor 96.3
black and white 96.3
text 91.7
monochrome 57.6
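
The Microsoft tags match the shape of Azure Computer Vision's analyze API output. A sketch assuming the v3.2 REST endpoint, with a placeholder resource and key:

import requests

# Hypothetical endpoint and key; Azure returns 0-1 confidences,
# rescaled to 0-100 in the listing above.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {
    "Ocp-Apim-Subscription-Key": "YOUR_KEY",
    "Content-Type": "application/octet-stream",
}

with open("scan.jpg", "rb") as f:
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers=headers,
        data=f.read(),
    )
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')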

Color Analysis

(none recorded)

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 97.6%
Calm 54%
Sad 44.7%
Confused 0.4%
Happy 0.3%
Surprised 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%
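
The age range, gender, and emotion scores above follow Amazon Rekognition's detect_faces schema; emotions are scored independently, which is why Calm (54%) and Sad (44.7%) can both be high. A minimal sketch, again assuming configured AWS credentials and a hypothetical scan.jpg:

import boto3

rekognition = boto3.client("rekognition")

with open("scan.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are scored independently, e.g. Calm 54%, Sad 44.7%.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')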

Google Vision

Identical results were returned for each of the eight detected faces:

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
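
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which matches the repeated "Very unlikely" entries. A minimal sketch with the google-cloud-vision client, assuming application credentials are configured:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("scan.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood bucket, not a confidence score.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)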

Feature analysis

Amazon

Person
Person 98.4%
Person 97.6%
Person 95.6%
Person 93.1%
Person 90.4%
Person 90.3%
Person 79.8%
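
These per-person percentages correspond to object instances on the "Person" label in the same detect_labels response sketched earlier; each instance also carries a bounding box:

# Continuing from the detect_labels call above: object-level detections
# (here, seven "Person" instances) have per-instance scores and boxes
# expressed as ratios of image width/height.
for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            box = instance["BoundingBox"]
            print(
                f'Person {instance["Confidence"]:.1f}% at '
                f'left={box["Left"]:.2f}, top={box["Top"]:.2f}, '
                f'{box["Width"]:.2f}x{box["Height"]:.2f}'
            )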

Categories

(none recorded)

Text analysis

Amazon

86
|
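
The detected strings "86" and "|" are typical Amazon Rekognition text output on a photographic print, where stray marks can register as characters. A minimal sketch of the call, assuming the same setup as above:

import boto3

rekognition = boto3.client("rekognition")

with open("scan.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE-level detections correspond to entries like "86" and "|" above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')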