Human Generated Data

Title

Untitled (bride and bridesmaids, New York, NY)

Date

1904, printed later

People

Artist: Percy C. Byron, American, 1879–1959

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3487

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.5
Person 99.2
Person 98.9
Person 98.4
Person 98.3
Person 97.9
Person 96
Person 94.7
Military 92.5
Military Uniform 89
Person 88.4
Officer 86.7
Person 83.6
Crowd 81.5
Clothing 69
Apparel 69
People 61.5
Sailor Suit 57.5
Funeral 56.7
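
Labels like these come from Amazon Rekognition's object-and-scene detection, which pairs a class name with a confidence score. A minimal sketch of how such output could be produced with the boto3 SDK; the bucket and object names are placeholders, not the museum's actual storage:

```python
# Hedged sketch: Rekognition label detection via boto3. Bucket/key are
# hypothetical; MinConfidence roughly mirrors the cutoff implied above.
import boto3

client = boto3.client("rekognition")

response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=25,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Prints e.g. "Person 99.5", "Military Uniform 89.0"
    print(f"{label['Name']} {label['Confidence']:.1f}")
```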

Clarifai
created on 2023-10-26

people 100
group 99.1
many 98.6
adult 98.6
group together 95.9
child 95.3
wear 95.2
man 94.8
portrait 92.8
monochrome 92.3
administration 92.2
street 92
woman 89.9
crowd 87.5
dress 86.6
leader 86.3
boy 85.2
art 84.2
music 79.7
wedding 76.8
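
The Clarifai concepts above appear scaled to percentages; the API itself returns scores in 0–1. A hedged sketch against Clarifai's v2 REST predict endpoint, where the model name, image URL, and key are assumptions for illustration:

```python
# Hedged sketch: Clarifai v2 predict over REST. Model name and image URL
# are hypothetical; scores are scaled x100 to match the list above.
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Prints e.g. "people 100.0", "wedding 76.8"
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```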

Imagga
created on 2022-01-23

silhouette 26.5
black 24
man 21.6
people 20.6
person 20.1
water 18
grunge 16.2
dark 15
male 14.2
art 13.4
adult 12.9
blackboard 12.8
world 11.9
texture 11.8
outfit 11.6
dirty 10.8
dance 10.4
sport 10.3
spectator 10.2
light 10
night 9.8
old 9.7
one 9.7
pattern 9.6
vintage 9.1
design 9
sunset 9
wet 8.9
landscape 8.9
sky 8.3
happy 8.1
group 8.1
cool 8
body 8
business 7.9
happiness 7.8
fashion 7.5
human 7.5
leisure 7.5
style 7.4
symbol 7.4
retro 7.4
clothing 7.3
drop 7.2
sun 7.2
paint 7.2
sexy 7.2
active 7.2
portrait 7.1
summer 7.1
businessman 7.1
travel 7
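
Imagga's tagger returns the same tag/confidence shape. A minimal sketch against its v2 tagging endpoint, assuming HTTP Basic auth with an API key/secret pair and a hypothetical image URL:

```python
# Hedged sketch: Imagga v2 tagging. Credentials and image URL are
# placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Prints e.g. "silhouette 26.5"
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```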

Microsoft
created on 2022-01-23

text 99.6
clothing 87.3
group 83.7
person 83
wedding dress 81.3
standing 78.2
old 72.1
bride 66.7
woman 65.4
white 64.9
people 60.2
black and white 56.1
dress 55.3
crowd 1.1
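
The Microsoft tags could come from Azure's Computer Vision service, which reports confidences in 0–1. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK; endpoint, key, and image URL are placeholders:

```python
# Hedged sketch: Azure Computer Vision tagging. Endpoint/key/URL are
# hypothetical; confidences are scaled x100 to match the list above.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    # Prints e.g. "clothing 87.3"
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```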

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 98.8%
Calm 97.5%
Happy 0.8%
Sad 0.4%
Confused 0.4%
Disgusted 0.3%
Angry 0.2%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Calm 81%
Surprised 6.1%
Sad 3.7%
Angry 2.6%
Confused 2.2%
Happy 2.2%
Fear 1.2%
Disgusted 0.9%

AWS Rekognition

Age 23-33
Gender Female, 70.3%
Happy 98.2%
Calm 0.7%
Sad 0.3%
Confused 0.2%
Angry 0.2%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 12-20
Gender Male, 98%
Calm 95.5%
Angry 1.5%
Happy 0.8%
Sad 0.6%
Surprised 0.5%
Disgusted 0.4%
Confused 0.4%
Fear 0.2%

AWS Rekognition

Age 18-26
Gender Female, 99.9%
Calm 99.6%
Sad 0.2%
Angry 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 24-34
Gender Female, 100%
Calm 43.7%
Disgusted 33.1%
Sad 14.1%
Angry 3.9%
Confused 2%
Fear 1.5%
Surprised 1%
Happy 0.7%

AWS Rekognition

Age 19-27
Gender Female, 96.9%
Calm 93.9%
Sad 1.5%
Disgusted 1%
Surprised 1%
Happy 1%
Angry 0.8%
Confused 0.7%
Fear 0.2%

AWS Rekognition

Age 16-24
Gender Female, 99.7%
Calm 80.4%
Sad 6%
Angry 5.1%
Happy 3.6%
Disgusted 2.1%
Confused 1.3%
Surprised 0.8%
Fear 0.6%

AWS Rekognition

Age 28-38
Gender Female, 99.2%
Calm 93.7%
Happy 2.3%
Sad 1.2%
Surprised 1.2%
Confused 0.6%
Angry 0.6%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Female, 100%
Fear 66.7%
Calm 12.9%
Disgusted 7.6%
Sad 6.2%
Confused 3.2%
Surprised 1.6%
Angry 1.2%
Happy 0.5%

AWS Rekognition

Age 31-41
Gender Female, 99.9%
Calm 64.9%
Disgusted 20.3%
Sad 5%
Confused 4.3%
Surprised 2.5%
Angry 1.8%
Fear 0.6%
Happy 0.6%
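
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender call with confidence, and a confidence distribution over eight emotions. A minimal boto3 sketch of the call that yields this structure; bucket and key are placeholders:

```python
# Hedged sketch: Rekognition face analysis. Attributes=["ALL"] requests
# age range, gender, and emotions rather than the default subset.
import boto3

client = boto3.client("rekognition")

response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Types arrive uppercase (e.g. "CALM"); prints "Calm 97.5%"
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```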

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
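
The Google Vision blocks report per-face likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A hedged sketch with the google-cloud-vision client; the local file path is a placeholder:

```python
# Hedged sketch: Google Cloud Vision face detection. The file path is
# hypothetical; each attribute is a Likelihood enum bucket.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Prints e.g. "Surprise VERY_UNLIKELY", "Headwear POSSIBLE"
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```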

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

17339
Bryinn

Google

Buron Y 17339
Buron
Y
17339
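
Both text readings above are OCR output: "Bryinn" and "Buron" are presumably misreads of the Byron studio signature, and 17339 is likely the studio's negative number. A minimal sketch of the Amazon side with boto3 text detection; bucket and key are placeholders:

```python
# Hedged sketch: Rekognition text detection (OCR). Bucket/key are
# hypothetical placeholders.
import boto3

client = boto3.client("rekognition")

response = client.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        # Prints whole detected lines, e.g. "17339"
        print(detection["DetectedText"])
```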