Human Generated Data

Title

Untitled (studio portrait of men with strings of fish)

Date

c. 1930

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5719

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.5
Human 99.5
Person 99.3
Person 99
Person 99
Person 98
People 74.7
Face 73.4
Art 69.9
Drawing 62.2
Sketch 57.2

Clarifai
created on 2019-06-01

people 100
group 99.7
man 98.6
adult 97.9
many 97.3
wear 96.6
woman 95.8
group together 94.1
several 93.6
veil 90
outfit 88.7
medical practitioner 84.4
administration 81.8
leader 80.7
outerwear 78
child 77.1
military 72.2
four 71
vehicle 71
uniform 70.3

Imagga
created on 2019-06-01

sketch 75
drawing 61.9
representation 44.8
negative 39.6
film 31.3
art 23.8
grunge 23
photographic paper 22.7
vintage 19.8
old 18.1
photographic equipment 15.1
retro 14.7
antique 14.7
texture 14.6
graphic 13.1
frame 12.5
decoration 12.4
design 12.4
black 12
barbershop 11.3
architecture 10.9
paint 10.9
decorative 10.9
style 10.4
ancient 10.4
symbol 10.1
people 10
aged 9.9
silhouette 9.9
history 9.8
textured 9.6
pattern 9.6
artistic 9.5
paper 9.5
men 9.4
ice 9.3
shop 8.9
man 8.7
water 8.7
ornament 8.6
structure 8.5
wallpaper 8.4
backgrounds 8.1
religion 8.1
detail 8
flower 7.7
snow 7.7
worn 7.6
historical 7.5
dollar 7.4
business 7.3
border 7.2
glass 7.2
mercantile establishment 7.1
creative 7.1

Google
created on 2019-06-01

Photograph 96.9
Snapshot 83.9
Picture frame 63.5
Photography 62.4
Family 58.6
Team 52.3
Art 50.2

Microsoft
created on 2019-06-01

posing 96.7
window 91.7
old 90.3
person 71.7
clothing 68.2
group 55.6
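Each service above reports a flat list of labels paired with confidence scores. A minimal sketch of working with that output, assuming the "label score" line format shown in the listings (the sample data is transcribed from the Microsoft section; the parsing helper and the 90-point threshold are illustrative, not part of any service's API):

```python
# Parse "label score" lines like the machine-generated tag
# listings above and keep only high-confidence labels.
# Sample data copied from the Microsoft section; the threshold
# is an arbitrary illustration.

def parse_tags(text):
    """Split lines of 'label score' into (label, score) pairs."""
    tags = []
    for line in text.strip().splitlines():
        label, score = line.rsplit(" ", 1)  # label may contain spaces
        tags.append((label, float(score)))
    return tags

microsoft = parse_tags("""
posing 96.7
window 91.7
old 90.3
person 71.7
clothing 68.2
group 55.6
""")

confident = [label for label, score in microsoft if score >= 90]
print(confident)  # ['posing', 'window', 'old']
```

Multi-word labels such as Clarifai's "group together 94.1" parse correctly because the score is split off from the right.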

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 52.7%
Angry 46.1%
Disgusted 45.3%
Confused 45.5%
Calm 48.3%
Sad 46%
Surprised 46.2%
Happy 47.6%

AWS Rekognition

Age 20-38
Gender Male, 51.6%
Confused 45.4%
Calm 49.4%
Sad 48.7%
Surprised 45.4%
Angry 45.5%
Disgusted 45.1%
Happy 45.6%

AWS Rekognition

Age 26-44
Gender Male, 54.7%
Happy 45.8%
Calm 53.3%
Angry 45.1%
Confused 45.2%
Surprised 45.2%
Sad 45.3%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Confused 45.3%
Happy 45.3%
Calm 52.9%
Disgusted 45.1%
Sad 45.5%
Angry 45.5%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Disgusted 45.1%
Sad 45.2%
Surprised 45.2%
Happy 45.2%
Angry 45.1%
Calm 54%
Confused 45.1%
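Each Rekognition face block above assigns a score to all eight emotions rather than a single verdict; the listed emotion with the highest score can be read as the dominant one. A sketch, with values transcribed from the first face block (the dictionary layout is illustrative, not Rekognition's response schema):

```python
# Pick the dominant emotion from a Rekognition-style emotion
# listing. Scores are transcribed from the first face analysis
# block above.

face_1 = {
    "Angry": 46.1, "Disgusted": 45.3, "Confused": 45.5,
    "Calm": 48.3, "Sad": 46.0, "Surprised": 46.2, "Happy": 47.6,
}

dominant = max(face_1, key=face_1.get)
print(dominant)  # Calm
```

By this reading, every face in the portrait scores "Calm" highest, though the margins over the other emotions are small.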

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 98.8%
text visuals 1.2%

Text analysis

Amazon

HEI
rea