Human Generated Data

Title

Untitled (man and woman on chairs, people standing behind)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16706

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 100
Apparel 100
Person 99.4
Human 99.4
Person 99.2
Person 98.8
Person 98.2
Person 97.9
Robe 96.9
Fashion 96.9
Gown 95.3
Person 94.5
Wedding 93.5
Person 92.9
Suit 91.5
Coat 91.5
Overcoat 91.5
Bride 87.8
Wedding Gown 87.8
Tie 76.9
Accessories 76.9
Accessory 76.9
Sunglasses 76.9
Female 73.5
Shirt 68.9
Bridegroom 68.7
Tie 68
Evening Dress 64.1
Person 64.1
Flooring 60.7
Long Sleeve 58.8
Sleeve 58.8
Woman 58.4
Tuxedo 57.9
People 57.8
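
The label/score pairs above are typical output of Amazon Rekognition's DetectLabels operation. A minimal sketch of how tags like these could be reproduced with boto3; the filename is a hypothetical placeholder and AWS credentials are assumed to be configured:

import boto3

client = boto3.client("rekognition")

# Hypothetical filename; Rekognition accepts raw image bytes up to 5 MB.
with open("untitled_4.2002.16706.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score listed above is about 57.8
)

# Each label is a name plus a confidence score, as in the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")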

Clarifai
created on 2023-10-29

people 99.7
group 98
woman 98
man 97.6
adult 96.8
indoors 88.6
wedding 88.5
medical practitioner 87.4
child 86.8
group together 86.3
three 82.1
veil 82.1
leader 78.6
wear 77.8
family 77.8
uniform 77.2
science 77.1
bride 77.1
monochrome 76.4
room 76.2

Imagga
created on 2022-02-26

nurse 30.2
people 29.5
male 29.1
man 28.9
adult 28.8
person 25.7
businessman 22.1
men 20.6
professional 20.2
portrait 20
happy 20
business 19.4
corporate 18.9
clothing 18.5
boutique 17.5
worker 16.9
job 15.9
medical 15.9
home 15.1
two 14.4
women 14.2
family 14.2
happiness 14.1
doctor 13.1
couple 13.1
smiling 13
office 12.8
smile 12.8
work 12.6
coat 12.3
standing 12.2
human 12
occupation 11.9
attractive 11.9
team 11.6
handsome 11.6
room 11.5
indoors 11.4
group 11.3
patient 11.3
modern 11.2
looking 11.2
health 11.1
indoor 10.9
businesswoman 10.9
lifestyle 10.8
interior 10.6
lab coat 10.6
fashion 10.5
brass 10.5
businesspeople 10.4
life 10.2
clinic 10
dress 9.9
hospital 9.8
colleagues 9.7
suit 9.6
black 9.6
building 9.5
adults 9.5
day 9.4
light 9.4
casual 9.3
old 9.1
care 9
medicine 8.8
together 8.8
outfit 8.7
30s 8.7
wind instrument 8.6
bright 8.6
career 8.5
clothes 8.4
pretty 8.4
mature 8.4
house 8.4
20s 8.2
cheerful 8.1
lady 8.1
window 8
mother 7.9
corporation 7.7
profession 7.7
elegance 7.6
teamwork 7.4
focus 7.4
inside 7.4
success 7.2
color 7.2
employee 7.2
love 7.1
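
Imagga's scores come from its REST API rather than a vendor SDK. A minimal sketch using requests against the v2 tags endpoint; the key, secret, and filename are hypothetical placeholders:

import requests

# Hypothetical credentials; the v2 endpoint uses HTTP basic auth.
auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

with open("untitled_4.2002.16706.jpg", "rb") as f:  # hypothetical filename
    response = requests.post("https://api.imagga.com/v2/tags", auth=auth, files={"image": f})

for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))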

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

wall 98.9
indoor 92.9
wedding dress 88.2
text 78.2
bride 75.2
clothing 69.4
person 67.2
wedding 65.7
room 48.9
bathroom 13.4
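
The Microsoft tags match the shape of the Azure Computer Vision tagging endpoint. A minimal sketch with the azure-cognitiveservices-vision-computervision client; the endpoint, key, and filename are hypothetical placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and subscription key.
client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("SUBSCRIPTION_KEY"),
)

with open("untitled_4.2002.16706.jpg", "rb") as f:  # hypothetical filename
    result = client.tag_image_in_stream(f)

# Confidences are returned on a 0-1 scale; the list above shows percentages.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))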

Color analysis

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 71%
Sad 98.4%
Fear 0.8%
Calm 0.2%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 26-36
Gender Male, 88.7%
Sad 66%
Happy 24.1%
Calm 4.4%
Angry 1.6%
Disgusted 1.1%
Surprised 1%
Fear 0.9%
Confused 0.9%

AWS Rekognition

Age 48-54
Gender Male, 92.1%
Sad 32.5%
Calm 22.1%
Confused 17%
Fear 8.1%
Happy 7.4%
Angry 5.9%
Surprised 4%
Disgusted 3%

AWS Rekognition

Age 33-41
Gender Male, 88.1%
Calm 77%
Surprised 6.5%
Fear 4.5%
Sad 4.2%
Happy 2.4%
Confused 2%
Disgusted 1.8%
Angry 1.6%

AWS Rekognition

Age 33-41
Gender Female, 68.8%
Sad 53.1%
Calm 36.6%
Happy 5.7%
Confused 2.2%
Angry 1.1%
Disgusted 0.5%
Surprised 0.4%
Fear 0.4%

AWS Rekognition

Age 40-48
Gender Female, 89.5%
Happy 42.8%
Calm 33.7%
Confused 13.1%
Fear 4%
Sad 3%
Disgusted 1.4%
Surprised 1.3%
Angry 0.8%

AWS Rekognition

Age 49-57
Gender Male, 75.6%
Sad 88.9%
Happy 5.5%
Calm 2.1%
Confused 1.5%
Angry 0.7%
Surprised 0.5%
Disgusted 0.5%
Fear 0.3%
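
The per-face blocks above have the shape of Rekognition's DetectFaces response with all attributes requested: an estimated age range, a gender guess with confidence, and a full set of emotion scores. A minimal sketch, reusing the same hypothetical filename as the tags example:

import boto3

client = boto3.client("rekognition")

with open("untitled_4.2002.16706.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Sort emotions by score to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")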

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
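
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the entries above read "Very unlikely" and "Possible". A minimal sketch, assuming google-cloud-vision 2.x and the same hypothetical filename:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_4.2002.16706.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum; .name yields e.g. VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)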

Feature analysis

Amazon

Person 99.4%
Person 99.2%
Person 98.8%
Person 98.2%
Person 97.9%
Person 94.5%
Person 92.9%
Person 64.1%
Tie 76.9%
Tie 68%
Sunglasses 76.9%
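
The repeated Person, Tie, and Sunglasses rows are individual instances of a label: in the DetectLabels response, labels for physical objects carry an Instances array, each entry with its own confidence and bounding box. A minimal sketch of pulling these out, under the same hypothetical setup:

import boto3

client = boto3.client("rekognition")

with open("untitled_4.2002.16706.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_labels(Image={"Bytes": f.read()})

# A label like Person can match several regions, which is why it
# appears eight times above with different confidences.
for label in response["Labels"]:
    for instance in label["Instances"]:
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(left={box['Left']:.2f}, top={box['Top']:.2f})")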

Categories

Imagga

interior objects 99.4%

Text analysis

Amazon

USN

Google

USN YT3A°2-AGO
USN
YT3A°2-AGO
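
The strings above are raw OCR hits, apparently the photograph's film edge markings, so garbled output like "YT3A°2-AGO" is expected. Google's full-string-plus-words layout reflects its line versus word split, and Rekognition's DetectText makes the same distinction. A minimal sketch of the Amazon side, under the same hypothetical setup:

import boto3

client = boto3.client("rekognition")

with open("untitled_4.2002.16706.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections are whole strings; WORD detections are their parts.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])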