Human Generated Data

Title

Untitled (men and women posed on porch of brick building)

Date

1925

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1871

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Clothing 99.2
Apparel 99.2
Person 99
Person 98.8
Person 98.1
Person 98.1
Person 97.2
Person 96.7
Person 91
Stage 84.9
Person 84.9
Coat 81.9
Person 81.1
Tie 78.7
Accessories 78.7
Accessory 78.7
Person 76.3
Clinic 73.7
People 68.1
Lab Coat 66.5
Portrait 62.7
Photography 62.7
Face 62.7
Photo 62.7
Female 60.5
Scientist 57.6
Robe 57.1
Fashion 57.1
Sleeve 56.4
Suit 56.1
Overcoat 56.1
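
Label lists like the Amazon block above follow the shape of Amazon Rekognition's DetectLabels response. As a minimal sketch, assuming a response of that shape (the dict below is a hand-made sample for illustration, not the actual output for this photograph), the name/confidence pairs can be pulled out like this:

```python
# Sketch: extracting label/confidence pairs from an Amazon Rekognition
# DetectLabels-style response. A live call would use
# boto3.client("rekognition").detect_labels(Image={"Bytes": image_bytes});
# the response below is a hand-made sample in the same shape, not the
# museum's raw data.
sample_response = {
    "Labels": [
        {"Name": "Tie", "Confidence": 78.7},
        {"Name": "Person", "Confidence": 99.7},
        {"Name": "Clothing", "Confidence": 99.2},
    ]
}

def extract_tags(response, min_confidence=55.0):
    """Return (name, confidence) pairs at or above min_confidence, highest first."""
    pairs = [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

tags = extract_tags(sample_response)
```

The confidence cutoff mirrors how such tag lists are usually truncated: labels below a threshold (here a hypothetical 55%) are simply dropped rather than shown at low scores.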

Clarifai
created on 2023-10-25

people 99.7
adult 98.8
group 98.3
man 97.3
wear 93.3
woman 91.2
many 90.4
group together 89.4
gown (clothing) 86.6
leader 85.1
administration 83.1
medical practitioner 79.6
veil 78
scientist 75.5
education 75.1
uniform 74.3
portrait 74.2
several 72.4
outerwear 70.6
music 70.2

Imagga
created on 2021-12-14

windowsill 53.5
sill 42.8
structural member 32.1
people 22.9
support 21.3
monitor 19.4
man 17.5
indoor 16.4
person 16.3
male 15.6
window 14.7
business 14.6
office 13.8
art 13.2
computer 13.1
building 12.5
architecture 12.5
television 12.1
black 12
home 12
groom 11.2
glass 10.9
design 10.7
interior 10.6
businessman 10.6
indoors 10.5
device 10.5
adult 10.5
modern 10.5
couple 10.4
old 10.4
equipment 10.1
silhouette 9.9
history 9.8
room 9.6
professional 9.6
electronic equipment 9.6
love 9.5
color 9.5
newspaper 9
technology 8.9
style 8.9
family 8.9
group 8.9
urban 8.7
men 8.6
monument 8.4
communication 8.4
house 8.4
inside 8.3
vintage 8.3
light 8
looking 8
smiling 8
marble 7.9
face 7.8
portrait 7.8
corporate 7.7
sitting 7.7
laptop 7.7
clothing 7.6
businesspeople 7.6
desk 7.6
city 7.5
life 7.3
businesswoman 7.3
lifestyle 7.2
column 7.2
women 7.1
work 7.1
together 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

window 99.5
text 99.5
clothing 96.6
person 96.2
man 78.4
woman 77
wedding dress 70.9
dress 57
posing 50.9

Color Analysis

Face analysis

AWS Rekognition

Age 49-67
Gender Male, 96.4%
Calm 92.4%
Happy 3.6%
Sad 1.3%
Angry 0.9%
Surprised 0.8%
Disgusted 0.5%
Confused 0.3%
Fear 0.2%

AWS Rekognition

Age 22-34
Gender Female, 85.1%
Calm 64.6%
Sad 28.7%
Angry 1.8%
Disgusted 1.5%
Confused 1.3%
Happy 1.2%
Fear 0.6%
Surprised 0.3%

AWS Rekognition

Age 50-68
Gender Male, 96.1%
Calm 99.5%
Angry 0.1%
Happy 0.1%
Surprised 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 36-54
Gender Female, 82.5%
Calm 56.2%
Sad 33.1%
Happy 5.9%
Confused 1.7%
Surprised 0.9%
Disgusted 0.8%
Fear 0.8%
Angry 0.5%

AWS Rekognition

Age 36-54
Gender Female, 85.9%
Calm 55.5%
Happy 38.7%
Sad 4.1%
Angry 0.9%
Surprised 0.3%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 51-69
Gender Male, 83.8%
Calm 81.5%
Sad 9.1%
Happy 6%
Confused 2%
Surprised 0.5%
Angry 0.4%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 44-62
Gender Female, 65.5%
Calm 94.3%
Sad 3.3%
Angry 1%
Fear 0.7%
Happy 0.3%
Disgusted 0.2%
Surprised 0.2%
Confused 0.1%

AWS Rekognition

Age 32-48
Gender Female, 64.5%
Calm 91.2%
Sad 5.9%
Fear 1.1%
Happy 0.8%
Surprised 0.4%
Confused 0.3%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 51-69
Gender Female, 50.8%
Calm 99.3%
Surprised 0.3%
Sad 0.1%
Disgusted 0.1%
Happy 0.1%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 51-69
Gender Male, 97.2%
Calm 96.5%
Happy 1.3%
Sad 0.9%
Surprised 0.3%
Disgusted 0.3%
Angry 0.3%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 49-67
Gender Male, 89.4%
Calm 63.1%
Sad 26.5%
Confused 2.7%
Happy 2.4%
Surprised 2.2%
Angry 1.6%
Fear 1%
Disgusted 0.6%
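
The per-face age/gender/emotion blocks above match the shape of Amazon Rekognition's DetectFaces response (one FaceDetail per detected face, with emotions scored individually). As a sketch, assuming that response shape (the dict below is a hand-made sample echoing the first face block, not the museum's raw data), each face can be rendered into lines like those shown:

```python
# Sketch: summarizing one face from an Amazon Rekognition DetectFaces-style
# response (requested with Attributes=["ALL"]). The sample below is
# illustrative only, echoing the first face block in this record.
sample_face = {
    "AgeRange": {"Low": 49, "High": 67},
    "Gender": {"Value": "Male", "Confidence": 96.4},
    "Emotions": [
        {"Type": "SAD", "Confidence": 1.3},
        {"Type": "CALM", "Confidence": 92.4},
        {"Type": "HAPPY", "Confidence": 3.6},
    ],
}

def summarize_face(face):
    """Render one FaceDetail as text lines like the blocks above."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']}%",
    ]
    # Emotions arrive unordered; sort by confidence, highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']}%")
    return "\n".join(lines)
```

Note that the emotion scores are per-label confidences, not a single classification, which is why each block lists all eight emotions with values summing to roughly 100%.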

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Tie 78.7%