Human Generated Data

Title

Untitled (men in striped suits, seated, with musical instruments)

Date

1928

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2003

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.7%
Person 99.7%
Person 99.3%
Person 99%
Person 99%
Person 97.2%
Person 95.9%
Room 93.8%
Indoors 93.8%
Person 93%
People 90.7%
Dressing Room 67.7%
Person 64.1%
Furniture 58.5%
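
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch with boto3 follows; the bucket, object key, and region are hypothetical placeholders, not values taken from this record.

```python
import boto3

# Hypothetical client and image location -- placeholders, not from this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "hamblin-studio-1928.jpg"}},
    MaxLabels=50,
    MinConfidence=50.0,  # a low threshold lets weak guesses through
)

for label in response["Labels"]:
    # Confidence is a 0-100 percentage, matching the scores listed above.
    print(f"{label['Name']} {label['Confidence']:.1f}%")
```

A low MinConfidence is what lets tentative guesses such as Dressing Room and Furniture appear alongside near-certain Person labels.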

Imagga
created on 2021-12-14

window 28.7%
door 23%
sliding door 22.5%
shop 19.5%
people 19.5%
black 16.2%
light 15.4%
barbershop 15.1%
interior 15%
house 14.2%
urban 14%
movable barrier 13.5%
architecture 13.3%
indoors 13.2%
mercantile establishment 12.6%
city 12.5%
business 12.1%
home 12%
women 11.9%
men 11.2%
room 11.1%
inside 11%
indoor 10.9%
dark 10.8%
person 10.7%
life 10.5%
modern 10.5%
group 10.5%
table 10.4%
motion 10.3%
building 10.1%
barrier 10.1%
art 10.1%
man 10.1%
adult 9.8%
night 9.8%
glass 9.7%
furniture 9.6%
design 9.6%
luxury 9.4%
floor 9.3%
silhouette 9.1%
old 9%
fashion 9%
decoration 8.9%
party 8.6%
walking 8.5%
place of business 8.5%
elegance 8.4%
human 8.2%
boutique 8.2%
celebration 8%
decor 7.9%
portrait 7.8%
comfortable 7.6%
two 7.6%
hand 7.6%
chair 7.6%
happy 7.5%
fun 7.5%
style 7.4%
symbol 7.4%
paint 7.2%
transportation 7.2%
male 7.1%
travel 7%
structure 7%
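
Imagga's weighted tags come from its REST tagging service. The sketch below assumes the v2 /tags endpoint with HTTP Basic authentication; the credentials and image URL are placeholders, and the response layout is assumed from Imagga's public documentation.

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/hamblin-studio-1928.jpg"},  # placeholder
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Each entry pairs an English tag with a 0-100 confidence score.
    print(f"{item['tag']['en']} {item['confidence']:.1f}%")
```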

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 97.8%
person 91.6%
clothing 89.2%
cartoon 83.7%
drawing 72.6%
man 57.5%
old 49.2%
posing 38.6%
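
The Microsoft tags are the kind of output produced by the Azure Computer Vision tag operation. A hedged sketch against the v3.2 REST endpoint follows; the endpoint, subscription key, and image URL are placeholders. The service reports confidence on a 0-1 scale, scaled to percent here to match the listing.

```python
import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"  # placeholder resource
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "SUBSCRIPTION_KEY"},  # placeholder
    json={"url": "https://example.org/hamblin-studio-1928.jpg"},  # placeholder
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # The API returns confidence in [0, 1]; scale to percent as listed above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}%")
```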

Face analysis

AWS Rekognition

Age 35-51
Gender Male, 57.8%
Calm 46.7%
Happy 29.1%
Confused 11%
Sad 6%
Surprised 3.7%
Angry 1.7%
Fear 1.1%
Disgusted 0.6%

AWS Rekognition

Age 42-60
Gender Male, 64.2%
Calm 82.9%
Sad 7.5%
Happy 5%
Surprised 1.9%
Angry 0.9%
Confused 0.9%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 13-25
Gender Female, 75.2%
Calm 48.4%
Sad 45.5%
Happy 4.5%
Confused 1%
Angry 0.3%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 50-68
Gender Male, 59%
Sad 78.5%
Calm 10.1%
Happy 9.3%
Confused 0.9%
Fear 0.4%
Angry 0.4%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 52-70
Gender Male, 91.8%
Calm 77.3%
Disgusted 9.8%
Sad 8.2%
Happy 2%
Surprised 1.3%
Angry 0.5%
Confused 0.5%
Fear 0.4%

AWS Rekognition

Age 47-65
Gender Female, 60.8%
Calm 74.5%
Sad 9.2%
Confused 5.9%
Happy 4.3%
Surprised 3.4%
Fear 1.3%
Angry 1%
Disgusted 0.4%

AWS Rekognition

Age 24-38
Gender Male, 86.4%
Happy 91.9%
Calm 7.3%
Sad 0.4%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%
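
Each of the seven AWS Rekognition entries above describes one detected face from a single DetectFaces call with all facial attributes requested. A minimal boto3 sketch (bucket and key are placeholders):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "hamblin-studio-1928.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unordered; sort high-to-low to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```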

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
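
The Google Vision face entries report bucketed likelihoods (Very unlikely through Very likely) rather than numeric scores. A sketch with the google-cloud-vision client library follows; the image URI is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/hamblin-studio-1928.jpg")
)

def pretty(value):
    # Map enum names such as VERY_UNLIKELY to "Very unlikely", as listed above.
    return vision.Likelihood(value).name.replace("_", " ").capitalize()

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
```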

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people posing for a photo 72.2%
a group of people posing for a photo in front of a window 67.2%
a group of people posing for the camera 67.1%
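
The ranked captions come from the Azure Computer Vision describe operation, which returns several candidate captions with confidences. A hedged sketch, using the same placeholder endpoint and key as the tag example above:

```python
import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"  # placeholder resource
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},  # the listing above shows three candidates
    headers={"Ocp-Apim-Subscription-Key": "SUBSCRIPTION_KEY"},  # placeholder
    json={"url": "https://example.org/hamblin-studio-1928.jpg"},  # placeholder
    timeout=30,
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    # Confidence is returned in [0, 1]; scale to percent as listed above.
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```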