Human Generated Data

Title

Untitled (children in Native American costumes posing with adult woman)

Date

1953

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6374

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Clothing 99.8
Shorts 99.8
Apparel 99.8
Person 99.5
Human 99.5
Person 99.3
Person 99.2
Vehicle 97.6
Car 97.6
Transportation 97.6
Automobile 97.6
Person 96.3
Person 95.1
Person 90.2
Person 88.1
Chair 81.3
Furniture 81.3
Person 76.1
Child 72.1
Kid 72.1
People 68.1
Road 64.4
Car 63.3
Female 60.4
Girl 60.4
Sedan 58.2
Helmet 57.8
Path 57.6
Pedestrian 57
Asphalt 56.2
Tarmac 56.2
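
The repeated "Person" rows above are per-instance detections rather than duplicates. As a minimal sketch, tags in this form are what the AWS Rekognition DetectLabels API returns via boto3; the file name, region, and confidence threshold below are placeholder assumptions, not part of this record:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder file; the source image is not part of this record
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # the list above bottoms out around 56%
)

for label in response["Labels"]:
    # Repeated labels such as "Person" correspond to multiple Instances,
    # each with its own bounding box and confidence.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')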

Clarifai
created on 2019-03-22

people 99.9
group together 99.7
group 98.6
vehicle 97.7
many 97.6
adult 97.2
man 95.9
several 94.8
recreation 93.1
wear 93.1
woman 92.3
athlete 90.3
transportation system 89.4
outfit 87.3
child 86.2
street 83.6
boy 83.5
monochrome 81.6
competition 79.7
five 78.6
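
A comparable sketch for Clarifai, using the v2 REST API that was current when these tags were generated (2019); the API key, model ID, and image URL are placeholder assumptions:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # assumed ID of the public "general" model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts come back with values in [0, 1]; scale to percentages as above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')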

Imagga
created on 2019-03-22

trombone 28.1
brass 27.3
pedestrian 24.7
people 22.8
wind instrument 22
man 20.8
black 17.4
city 16.6
person 15.5
musical instrument 15.4
business 14.6
crowd 14.4
bass 14.3
adult 14.2
device 13.3
urban 13.1
male 12.8
passenger 11.3
group 11.3
silhouette 10.8
men 10.3
women 10.3
chair 9.8
glass 9.6
walking 9.5
motion 9.4
lifestyle 9.4
active 9.4
light 9.3
street 9.2
human 9
body 8.8
music 8.4
health 8.3
transport 8.2
transportation 8.1
commuter 7.9
sport 7.8
musician 7.8
walk 7.6
athlete 7.6
outdoors 7.5
one 7.5
wine 7.4
runner 7.3
design 7.3
seat 7.3
color 7.2
portrait 7.1
player 7.1
indoors 7
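
Imagga's tags map to its v2 /tags endpoint, which uses HTTP Basic auth with an API key and secret. A minimal sketch, with credentials and image URL as placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()

# Confidences are already on a 0-100 scale, matching the list above.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')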

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

person 98.8
outdoor 92
group 56.3
street 56.3
black and white 32.3
girl 15.4
monochrome 14.4
boy 9.9
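
Microsoft's tags correspond to the Azure Computer Vision analyze endpoint with the Tags visual feature; around 2019 this was the v2.0 API. The endpoint host, key, and image URL below are placeholders:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},  # placeholder image URL
)
resp.raise_for_status()

# Confidences come back in [0, 1]; scale to percentages as above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')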

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 50.6%
Sad 47.4%
Happy 48.5%
Calm 47.2%
Angry 45.8%
Disgusted 45.2%
Confused 45.3%
Surprised 45.5%

AWS Rekognition

Age 10-15
Gender Female, 52.6%
Disgusted 45.3%
Angry 45.3%
Happy 45.6%
Confused 45.1%
Calm 52.7%
Surprised 45.3%
Sad 45.7%

AWS Rekognition

Age 17-27
Gender Male, 53.3%
Disgusted 45.5%
Angry 45.3%
Confused 45.2%
Sad 45.4%
Happy 45.4%
Calm 52.9%
Surprised 45.5%

AWS Rekognition

Age 26-43
Gender Female, 51.3%
Happy 46.3%
Sad 49.6%
Angry 45.8%
Surprised 45.6%
Confused 45.7%
Disgusted 46%
Calm 45.9%

AWS Rekognition

Age 10-15
Gender Male, 53%
Happy 45.1%
Sad 52.2%
Disgusted 45.1%
Angry 45.3%
Calm 46.9%
Surprised 45.2%
Confused 45.3%
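
Each block above is one face from AWS Rekognition's DetectFaces API with full attributes requested. The emotion rows are independent per-label confidences rather than a distribution summing to 100%, which is why every value sits near 45-53%. A minimal boto3 sketch (file name and region are placeholders):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in response["FaceDetails"]:
    print(f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # independent confidence per emotion label
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')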

Feature analysis

Amazon

Person 99.5%
Car 97.6%

Text analysis

Amazon

ARP
MINNE
MINNEKHA
ARP 5CHOL
831316
MINNEKHA MINNE YC4
5CHOL
KODIK
YC4
KODIK-AELA
KODIK-AELA 2OEELA i KODIK
2OEELA
i
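
The fragmentary strings above are raw OCR output. AWS Rekognition's DetectText API returns both LINE and WORD detections, so overlapping substrings of the same painted text repeat in the list. A minimal boto3 sketch (file name and region are placeholders):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder file
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; WORDs reference their parent LINE via ParentId.
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}%')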

Google

MINNE KHA
MINNE
KHA