Human Generated Data

Title

Untitled (two men talking next to machine processing grain)

Date

1958

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6471

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Person 98.1
Human 98.1
Apparel 96.4
Clothing 96.4
Person 81.7
Face 73.5
Outdoors 73.3
Portrait 63.9
Photography 63.9
Photo 63.9
Nature 60.3
Hat 60.3
Weather 60
Suit 59
Coat 59
Overcoat 59
Machine 58.4
Spoke 58.4
Furniture 57.1
Chair 57.1
Silhouette 55.6
Sun Hat 55.2
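Each machine tag above is a label paired with a confidence score (percent). A minimal sketch of filtering such pairs by a confidence threshold, using a few of the Amazon values from the list; the helper name `confident_labels` is illustrative, not part of any vendor API:

```python
# A few of the Amazon tags above as (label, confidence) pairs.
tags = [
    ("Person", 98.1), ("Apparel", 96.4), ("Clothing", 96.4),
    ("Face", 73.5), ("Outdoors", 73.3), ("Silhouette", 55.6),
]

def confident_labels(pairs, threshold=90.0):
    """Keep labels whose confidence meets the threshold, highest score first."""
    kept = [(label, score) for label, score in pairs if score >= threshold]
    return [label for label, _ in sorted(kept, key=lambda p: -p[1])]

print(confident_labels(tags))  # labels at or above 90% confidence
```

At the default 90% cutoff only Person, Apparel, and Clothing survive; lowering the threshold admits the weaker guesses (Silhouette, Outdoors) further down the list.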

Clarifai
created on 2019-03-22

people 99.6
adult 98.6
one 97.3
man 96.8
wear 96.6
portrait 90.6
woman 90
veil 89.4
vehicle 86.4
two 85.7
music 85.6
outfit 85
wedding 83.4
light 81.2
retro 77.1
facial expression 76.5
musician 76.4
recreation 76.4
vintage 76.3
dress 74.6

Imagga
created on 2019-03-22

man 22.2
canvas tent 20.5
person 19.6
male 16.3
people 14.5
black 14.4
adult 12.3
business 12.1
sitting 12
smile 11.4
light 11.4
groom 11.3
portrait 11
work 11
dress 10.8
suit 10.8
happy 10.6
computer 10.5
one 10.4
television 10.4
office 10.4
laptop 10.4
newspaper 10.1
human 9.7
businessman 9.7
tent 9.5
face 9.2
art 9.1
protective covering 9
color 8.9
worker 8.9
support 8.9
travel 8.4
dark 8.3
indoor 8.2
sensuality 8.2
style 8.2
grand piano 8.1
smiling 8
hair 7.9
windshield 7.9
negative 7.9
screen 7.8
modern 7.7
telecommunication system 7.7
structure 7.6
piano 7.6
fun 7.5
design 7.3
device 7.2
looking 7.2
blond 7.2
night 7.1
interior 7.1

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

black 71.8
old 67.1
art 67.1
infrared 58.9
black and white 54.5
monochrome 51
street 37.6
statue 36.8
winter 33.9

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 74.1%
Disgusted 2.4%
Calm 49.5%
Confused 5.1%
Angry 4.5%
Sad 24.3%
Surprised 12.4%
Happy 1.8%

AWS Rekognition

Age 35-52
Gender Female, 84.8%
Confused 6%
Disgusted 4.9%
Happy 6.6%
Calm 58.8%
Sad 10.3%
Angry 7%
Surprised 6.5%
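Each Rekognition face block reports a score per emotion rather than a single verdict; the dominant emotion is simply the highest-scoring entry. A sketch using the scores from the second face above (variable names are illustrative):

```python
# Emotion scores (percent) from the second AWS Rekognition face above.
emotions = {
    "Confused": 6.0, "Disgusted": 4.9, "Happy": 6.6,
    "Calm": 58.8, "Sad": 10.3, "Angry": 7.0, "Surprised": 6.5,
}

# The dominant emotion is the key with the maximum score.
dominant = max(emotions, key=emotions.get)
print(dominant)  # -> Calm
```

For both faces in this record the scores are spread across several emotions, so "Calm" wins only by plurality, not by a confident margin.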

Feature analysis

Amazon

Person 98.1%
