Human Generated Data

Title

Untitled (bride tossing her bouquet outside church)

Date

c. 1950

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2793

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.2
Human 99.2
Person 98.5
Person 98.3
Person 98.1
Person 97.9
Person 91.6
Person 90.5
Person 88.3
Stage 80.9
Clothing 80.2
Apparel 80.2
Leisure Activities 75.4
Performer 70.7
Crowd 69.5
Musician 67.8
Musical Instrument 67.8
People 67.8
Flower 65.5
Blossom 65.5
Plant 65.5
Music Band 64.2
Suit 63.8
Overcoat 63.8
Coat 63.8
Sitting 58
Silhouette 57.6
Dance Pose 56.7
Mannequin 56.3
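
For readers who want to see how label data of this form is produced, the following is a minimal sketch of an AWS Rekognition DetectLabels call via boto3. The image file name and the label/confidence thresholds are placeholders, not values taken from this record.

    import boto3

    # Sketch: request labels comparable to the Amazon tags listed above.
    # The file name is a placeholder for the digitized negative.
    rekognition = boto3.client("rekognition")

    with open("annas_bride_bouquet.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=30,
            MinConfidence=55,
        )

    # Each label carries a name and a confidence score (0-100).
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')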

Clarifai
created on 2023-10-26

people 99.9
group 99.2
man 98.7
woman 97.7
adult 96.1
music 96
leader 94.3
group together 91.9
veil 90.5
many 90.2
musician 89.5
child 88.4
actor 88.4
ceremony 87.4
singer 87.2
administration 85.5
education 82.4
interaction 81.8
wedding 81.2
several 81
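
Concept scores like those above come from Clarifai's predict endpoint. The sketch below follows the v2 REST pattern as commonly documented; the API key, model ID, and image URL are placeholders, and the exact request shape should be checked against Clarifai's current documentation.

    import requests

    # Sketch (assumed endpoint shape): Clarifai v2 predict with the general model.
    CLARIFAI_KEY = "YOUR_CLARIFAI_KEY"          # placeholder credential
    MODEL_ID = "general-image-recognition"      # assumed general model ID
    IMAGE_URL = "https://example.org/annas_bride_bouquet.jpg"  # placeholder

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )

    # Concepts are returned with a name and a value between 0 and 1.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')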

Imagga
created on 2022-01-16

brass 49.2
wind instrument 43.1
musical instrument 34
man 30.2
male 22.7
trombone 20.2
cornet 19.4
people 16.7
person 15.8
adult 13.5
statue 13.4
old 13.2
men 12.9
monument 12.1
couple 11.3
device 10.4
business 10.3
architecture 10.1
city 10
room 9.7
businessman 9.7
indoors 9.7
world 9.6
black 9
religion 9
history 8.9
sculpture 8.7
sitting 8.6
art 8.5
protection 8.2
suit 8.1
group 8.1
family 8
job 8
work 7.8
sax 7.5
historical 7.5
religious 7.5
tourism 7.4
church 7.4
building 7.2
dress 7.2
worker 7.1
love 7.1
mask 7.1
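
Tags of this kind can be requested from Imagga's /v2/tags endpoint with HTTP basic authentication. The credentials and image URL below are placeholders; this is a sketch of the documented request pattern, not code taken from this project.

    import requests

    # Sketch: Imagga tagging request producing tag/confidence pairs like those above.
    IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credential
    IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder credential
    IMAGE_URL = "https://example.org/annas_bride_bouquet.jpg"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )

    # Each entry has an English tag and a confidence score (0-100).
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')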

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

person 96.4
text 94
clothing 91.9
concert 75.3
wedding dress 73.9
dance 70.7
man 67.3
musical instrument 66.5
woman 64.8
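
The Microsoft tags correspond to Azure Computer Vision's image analysis. The sketch below uses the v3.2 REST Analyze endpoint with the Tags feature; the endpoint, subscription key, and image URL are placeholders, and the API version may differ from the one used to generate this record.

    import requests

    # Sketch (assumed v3.2 endpoint): Azure Computer Vision tag analysis.
    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
    IMAGE_URL = "https://example.org/annas_bride_bouquet.jpg"       # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )

    # Tags are returned with a name and a confidence between 0 and 1.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')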

Color Analysis

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 100%
Surprised 75.8%
Sad 8%
Confused 6.6%
Happy 4.2%
Disgusted 2.8%
Fear 1%
Calm 0.9%
Angry 0.7%

AWS Rekognition

Age 16-24
Gender Female, 96.9%
Surprised 35.8%
Happy 30.2%
Calm 26.6%
Sad 3.4%
Disgusted 1.4%
Angry 1.2%
Fear 1.1%
Confused 0.5%

AWS Rekognition

Age 49-57
Gender Male, 96.8%
Fear 95.3%
Sad 2.5%
Calm 0.8%
Surprised 0.5%
Disgusted 0.3%
Angry 0.2%
Happy 0.2%
Confused 0.1%

AWS Rekognition

Age 45-53
Gender Male, 69.9%
Calm 94.9%
Happy 4.3%
Sad 0.4%
Surprised 0.2%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 69%
Calm 83.5%
Happy 9.1%
Sad 5.9%
Confused 0.6%
Disgusted 0.3%
Angry 0.3%
Surprised 0.2%
Fear 0.1%
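
The age ranges, gender estimates, and emotion scores above are the kind of output AWS Rekognition's DetectFaces returns when all facial attributes are requested. A minimal boto3 sketch follows; the image file name is a placeholder.

    import boto3

    # Sketch: per-face age range, gender, and emotion estimates like those above.
    rekognition = boto3.client("rekognition")

    with open("annas_bride_bouquet.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        # Emotions are reported as a list of type/confidence pairs.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')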

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
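
Ratings such as "Very unlikely" map to the likelihood enum that Google Cloud Vision's face detection returns for each face. A minimal sketch with the google-cloud-vision client follows; the image file name is a placeholder.

    from google.cloud import vision

    # Sketch: per-face likelihood ratings (VERY_UNLIKELY .. VERY_LIKELY),
    # corresponding to the "Very unlikely" / "Unlikely" values listed above.
    client = vision.ImageAnnotatorClient()

    with open("annas_bride_bouquet.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)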

Feature analysis

Amazon

Person 99.2%

Categories

Imagga

paintings art 91.7%
interior objects 7.8%

Text analysis

Amazon

7
FILM
KODAK
SAFETY

Google

KODAK SAFETY FILM 7
KODAK
SAFETY
FILM
7
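
The "KODAK SAFETY FILM 7" edge markings above are the kind of result returned by AWS Rekognition's DetectText and Google Cloud Vision's text detection. A minimal sketch of both calls follows; the image file name is a placeholder.

    import boto3
    from google.cloud import vision

    with open("annas_bride_bouquet.jpg", "rb") as f:
        image_bytes = f.read()

    # Amazon: detections are returned per line and per word, each with a type label.
    rekognition = boto3.client("rekognition")
    amazon = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in amazon["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])

    # Google: the first annotation is the full recognized text,
    # followed by the individual words.
    client = vision.ImageAnnotatorClient()
    google = client.text_detection(image=vision.Image(content=image_bytes))
    for annotation in google.text_annotations:
        print(annotation.description)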