Human Generated Data

Title

Untitled (nine children posed with goat on outdoor steps)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9823

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.8
Human 99.8
Person 99.5
Person 99
Shelter 98.4
Nature 98.4
Outdoors 98.4
Building 98.4
Countryside 98.4
Rural 98.4
Person 96.8
Person 96.8
Housing 96.6
Person 95.7
Person 95.5
Person 92.7
Clothing 87
Apparel 87
People 81.5
House 81.2
Shorts 78.4
Transportation 74.8
Person 74.7
Vehicle 73.7
Kid 73.4
Child 73.4
Face 70.8
Tree 68.5
Plant 68.5
Land 66
Train 62.3
Female 60.8
Urban 60.1
Yard 59.3

Clarifai
created on 2023-10-27

people 99.9
child 99.4
adult 99.1
group 98.7
canine 98.3
woman 97.8
vehicle 96.8
man 96.5
group together 96.1
dog 95.3
veil 93.5
recreation 92.6
transportation system 92
sit 92
sitting 91.1
administration 90.3
wear 89.3
illustration 88
monochrome 87.6
many 86.9

Imagga
created on 2022-01-28

bench 29.1
park bench 24.4
kin 20.5
seat 18.1
man 16.9
old 16
people 15.1
park 14.9
outdoors 13.9
outdoor 13.8
black 13.2
furniture 12.6
portrait 12.3
wheeled vehicle 11.8
tree 11.6
lifestyle 11.6
child 11.4
vintage 10.7
adult 10.3
sitting 10.3
person 10.2
male 10
building 9.8
tricycle 9.7
antique 9.5
vehicle 9.5
culture 9.4
happy 9.4
architecture 9.4
smile 9.3
art 9.2
travel 9.1
landscape 8.9
couple 8.7
day 8.6
grunge 8.5
traditional 8.3
sky 8.3
fun 8.2
chair 8.1
water 8
interior 8
autumn 7.9
boy 7.8
sculpture 7.8
ancient 7.8
empty 7.7
structure 7.7
statue 7.6
leisure 7.5
holding 7.4
style 7.4
fountain 7.4
retro 7.4
decoration 7.3
home 7.2
religion 7.2
history 7.2
mother 7.1
love 7.1
snow 7.1
summer 7.1
rural 7
together 7

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 94.6
outdoor 93.2
black and white 87.2
horse 77.8
black 67.2
statue 52.6
old 47.6
seat 41.3

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 96.6%
Calm 96.5%
Sad 1.6%
Angry 0.5%
Happy 0.5%
Disgusted 0.3%
Fear 0.2%
Confused 0.2%
Surprised 0.1%

AWS Rekognition

Age 36-44
Gender Male, 93.7%
Calm 83.3%
Sad 7.3%
Happy 5.5%
Disgusted 1.1%
Confused 1%
Angry 0.8%
Surprised 0.6%
Fear 0.4%

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 42.3%
Sad 31.9%
Happy 22.3%
Fear 1%
Angry 0.9%
Confused 0.6%
Disgusted 0.5%
Surprised 0.4%

AWS Rekognition

Age 38-46
Gender Female, 88.1%
Calm 88.6%
Happy 6.8%
Sad 1.7%
Surprised 0.8%
Fear 0.7%
Confused 0.5%
Disgusted 0.5%
Angry 0.3%

AWS Rekognition

Age 29-39
Gender Female, 87.9%
Calm 98.1%
Sad 0.8%
Happy 0.4%
Surprised 0.2%
Fear 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 28-38
Gender Female, 97.3%
Calm 65.9%
Sad 25.4%
Happy 3.2%
Fear 2.4%
Confused 1.5%
Disgusted 0.7%
Angry 0.5%
Surprised 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.8%
Person 99.5%
Person 99%
Person 96.8%
Person 96.8%
Person 95.7%
Person 95.5%
Person 92.7%
Person 74.7%

Text analysis

Amazon

POICH
KODAK--SEEIA--EITW

Google

2102 A°2--AO
2102
A°2--AO