Human Generated Data

Title

Untitled (three children petting dog on front steps of house)

Date

1953

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9460

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-23

Clothing 99.8
Apparel 99.8
Person 99.4
Human 99.4
Person 99.3
Dress 99.1
Person 98.4
Female 97.4
Chair 96.6
Furniture 96.6
Face 95.8
Woman 89.5
Outdoors 82.1
Nature 82.0
Costume 82.0
Girl 80.4
Suit 80.2
Coat 80.2
Overcoat 80.2
Portrait 76.3
Photography 76.3
Photo 76.3
Shoe 72.8
Footwear 72.8
Building 72.5
Kid 72.3
Child 72.3
People 72
Man 66.4
Porch 64.2
Housing 64.1
Advertisement 63.7
Plant 62.4
Hat 62.1
Play 58.8
Collage 58.5
Poster 58.5
Baby 57.5
Tree 57.4
Door 56.9
Teen 56.2
Floor 55.3

Clarifai
created on 2023-10-26

people 99.8
two 97.8
child 97.7
group 97.7
adult 96.9
three 96.2
woman 94.1
man 93.7
group together 93.7
offspring 91.0
monochrome 89.7
recreation 89.2
wear 89.1
administration 89.0
leader 88.9
four 86.8
actress 86.6
lid 86.1
veil 85.7
vehicle 85.5

Imagga
created on 2022-01-23

blackboard 38.1
man 28.2
people 24.5
person 23.9
newspaper 23.7
laptop 20.9
male 19.8
computer 18.6
working 18.6
product 18.1
work 17.9
adult 17.1
business 17.0
men 14.6
creation 14.1
technology 14.1
black 13.8
worker 13.3
professional 13.2
businessman 13.2
sitting 12.9
job 12.4
smile 12.1
happy 11.9
lifestyle 11.6
office 11.2
scholar 10.9
lady 10.5
couple 10.4
room 10.2
musical instrument 9.8
attractive 9.8
old 9.7
portrait 9.7
success 9.7
bench 9
chair 8.9
boy 8.7
outside 8.6
modern 8.4
executive 8.4
pretty 8.4
building 8.3
outdoors 8.2
brass 8.2
cheerful 8.1
television 8.1
transportation 8.1
home 8
intellectual 7.9
women 7.9
standing 7.8
seat 7.8
travel 7.7
wall 7.7
casual 7.6
leisure 7.5
one 7.5
wind instrument 7.3
smiling 7.2

Microsoft
created on 2022-01-23

text 99.6
black and white 88.5
clothing 84.7
person 83.3
drawing 66.3

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 54.4%
Calm 99.7%
Happy 0.2%
Sad 0.1%
Surprised 0%
Disgusted 0%
Confused 0%
Fear 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Person 99.3%
Person 98.4%
Shoe 72.8%

Text analysis

Amazon

270
MACOX
VSS
11351

Google

210 13S
210
13S