Human Generated Data

Title

Untitled (five photographs: horse-drawn sleigh in snow in front of building; family portrait on porch in front of house; group of people in front of house; couple in horse and carriage on street; studio portrait of a seated couple)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6164

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Poster 99.4
Advertisement 99.4
Collage 99.4
Human 99.3
Person 99.3
Person 99.1
Person 99
Person 98.6
Person 98.1
Person 97.7
Person 93.8
Animal 85.9
Mammal 85.9
Horse 85.9
Person 81.2
Person 78.4
Person 74.6
Person 72.4
Person 68.1
Hair 66.1
Outdoors 63.2
Person 62.1
Person 60.2
Person 59
Clothing 58.1
Apparel 58.1
Suit 58.1
Coat 58.1
Overcoat 58.1
Person 52.5
Person 52.2
Person 48.8
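
The list above is the raw output of an Amazon Rekognition label-detection call, given as label/confidence pairs. A minimal sketch of how such tags could be reproduced with boto3 follows; the region, bucket, object key, and thresholds are placeholders, not values taken from this record.

```python
# Minimal sketch: reproduce Rekognition-style label tags for an image.
# Assumes AWS credentials are configured; bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "durette-studio.jpg"}},
    MaxLabels=50,        # cap on returned labels
    MinConfidence=45.0,  # drop low-confidence labels, similar to the cutoff in the list above
)

for label in response["Labels"]:
    # Prints e.g. "Poster 99.4", matching the "label confidence" format above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```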

Clarifai
created on 2019-05-30

people 99.9
group 99.3
man 98.3
adult 98.1
monochrome 97.6
woman 97.1
group together 94.7
many 93.6
street 93.4
furniture 93.3
wear 93.3
child 92.4
room 91.8
music 88.8
several 88
musician 87.5
one 87.3
two 85.8
war 84.2
outfit 84

Imagga
created on 2019-05-30

barbershop 62.7
shop 52.3
mercantile establishment 41.5
window 33.2
place of business 27.6
old 25.1
building 22.1
city 20.8
architecture 19.6
negative 17.9
art 17.6
vintage 16.5
grunge 16.2
black 15.6
film 15.5
history 15.2
urban 14.8
glass 14.8
antique 14.7
establishment 13.9
facade 13.4
house 13.4
historic 12.8
framework 12.1
ancient 12.1
wall 12
texture 11.8
dirty 10.8
retro 10.6
structure 10.5
historical 10.3
office 10.3
design 10.1
frame 10
paint 10
travel 9.9
tourism 9.1
decoration 8.8
graphic 8.8
man 8.7
light 8.7
supporting structure 8.4
symbol 8.1
door 8
water 8
home 8
interior 8
boutique 7.7
culture 7.7
sculpture 7.6
decorative 7.5
photographic paper 7.5
monument 7.5
silhouette 7.4
street 7.4
business 7.3
people 7.2
aged 7.2
night 7.1
sky 7

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

black and white 87.1
person 86.9
clothing 79.9
street 72.1
horse 67.3
people 59.5
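
The Microsoft tags above come from the Azure Computer Vision image-analysis service. A hedged sketch using its REST interface is below; the endpoint, key, image URL, and the v3.2 route are assumptions for illustration, not details from this record.

```python
# Minimal sketch: Azure Computer Vision "analyze" call returning tags with
# confidences like the list above. All identifiers below are placeholders.
import requests

endpoint = "https://example-resource.cognitiveservices.azure.com"  # placeholder resource
key = "YOUR-SUBSCRIPTION-KEY"                                      # placeholder key

response = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.com/durette-studio.jpg"},        # placeholder image URL
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidence is returned as 0-1; e.g. 0.721 is rendered above as "street 72.1".
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```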

Color Analysis

Face analysis

AWS Rekognition

Age 45-65
Gender Female, 53.4%
Angry 46.1%
Surprised 45.2%
Confused 45.3%
Calm 48%
Happy 47.6%
Disgusted 45.1%
Sad 47.7%

AWS Rekognition

Age 45-63
Gender Male, 54.7%
Confused 45.5%
Happy 46.7%
Surprised 47.7%
Calm 49.3%
Sad 45.3%
Angry 45.3%
Disgusted 45.2%

AWS Rekognition

Age 35-55
Gender Male, 50.5%
Sad 49.6%
Angry 49.6%
Disgusted 49.6%
Calm 49.9%
Confused 49.7%
Happy 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 50.2%
Calm 49.7%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 35-53
Gender Male, 50.2%
Confused 49.6%
Happy 49.5%
Surprised 49.6%
Calm 49.8%
Sad 49.8%
Angry 49.6%
Disgusted 49.6%

AWS Rekognition

Age 38-59
Gender Male, 50.4%
Confused 49.5%
Sad 49.7%
Calm 49.6%
Surprised 49.5%
Disgusted 49.6%
Angry 50%
Happy 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Happy 49.5%
Confused 49.5%
Calm 49.5%
Angry 50.4%
Sad 49.5%
Disgusted 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-44
Gender Female, 50.2%
Surprised 49.5%
Calm 49.6%
Happy 49.6%
Confused 49.5%
Sad 50.1%
Disgusted 49.5%
Angry 49.7%

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Sad 49.8%
Happy 49.6%
Disgusted 49.9%
Surprised 49.5%
Calm 49.5%
Angry 49.6%
Confused 49.6%

AWS Rekognition

Age 48-68
Gender Female, 50.1%
Angry 49.5%
Sad 50.3%
Surprised 49.5%
Confused 49.5%
Happy 49.5%
Calm 49.5%
Disgusted 49.6%

AWS Rekognition

Age 23-38
Gender Male, 50%
Surprised 49.5%
Disgusted 49.7%
Confused 49.6%
Happy 49.6%
Angry 49.6%
Calm 49.7%
Sad 49.8%

AWS Rekognition

Age 35-55
Gender Male, 50.4%
Disgusted 50.4%
Happy 49.5%
Confused 49.5%
Surprised 49.5%
Calm 49.5%
Angry 49.6%
Sad 49.5%

AWS Rekognition

Age 35-52
Gender Female, 50.3%
Angry 49.6%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Happy 49.5%
Calm 49.5%
Sad 50.3%

AWS Rekognition

Age 12-22
Gender Female, 50%
Disgusted 49.8%
Angry 49.8%
Surprised 49.5%
Confused 49.5%
Happy 49.6%
Sad 49.6%
Calm 49.7%
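
Each AWS Rekognition block above describes one face detected in the prints, with an estimated age range, a gender guess, and per-emotion confidences. A minimal sketch of the underlying call with boto3, using placeholder image locations:

```python
# Minimal sketch: Rekognition face analysis that yields age ranges, gender,
# and emotion confidences like the blocks above. Bucket/key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "durette-studio.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 45, "High": 65}
    gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 53.4}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # e.g. "Calm 48.0%"
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```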

Microsoft Cognitive Services

Age 70
Gender Male

Microsoft Cognitive Services

Age 59
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
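
The two Google Vision blocks above report face attributes as likelihood ratings rather than percentages. A minimal sketch with the google-cloud-vision client follows; the file path is a placeholder, and a client version that accepts vision.Image directly (rather than the older vision.types.Image) is assumed.

```python
# Minimal sketch: Google Vision face detection, which returns likelihood
# enums (e.g. VERY_UNLIKELY) rather than numeric confidences.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("durette-studio.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum; .name gives e.g. "VERY_UNLIKELY",
    # which the listing above renders as "Very unlikely".
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```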

Feature analysis

Amazon

Person 99.3%
Horse 85.9%

Categories

Text analysis

Google

NCORE
NCORE
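
The repeated "NCORE" entries are text that Google's OCR detected in the prints; the first annotation in a text-detection response is the full detected block and the rest are individual words, so the same string can appear more than once. A minimal sketch of the call, again with a placeholder file path:

```python
# Minimal sketch: Google Vision text detection (OCR), the source of the
# "NCORE" strings above. File path is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("durette-studio.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    # Prints the full detected text first, then each detected word.
    print(annotation.description)
```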