Human Generated Data

Title

Untitled (owners and employees outside automobile company)

Date

1950

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6273

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.8
Human 99.8
Person 99.7
Person 99.6
Person 99.6
Person 99.5
Person 99.2
Person 99
Person 98.6
Shop 91.8
Person 91.3
Door 67.7
People 67.6
Road 64.6
Clothing 62.6
Apparel 62.6
Car 58.9
Transportation 58.9
Vehicle 58.9
Automobile 58.9
Window Display 58.8
Postal Office 57.8
Bus Stop 55.5
Suit 55
Coat 55
Overcoat 55

Clarifai
created on 2023-10-26

people 99.7
group 96.7
child 95.9
man 95.2
education 94.6
adult 93.6
woman 93.5
school 93
war 91
boy 89.7
group together 88.8
administration 88
family 86
uniform 84.3
many 81.6
room 80.6
leader 79.6
soldier 78.2
music 77.7
elementary school 77.4

Imagga
created on 2022-01-22

building 39
architecture 34.4
office 27.1
city 22.4
urban 18.3
structure 17.1
business 16.4
window 16.2
old 16
house 15.9
center 14.8
travel 13.4
modern 13.3
exterior 12.9
sky 12.7
boutique 11.9
station 11.9
facade 11.8
landmark 11.7
glass 11.7
tourism 11.5
windows 11.5
facility 11.4
place 11.2
people 11.1
interior 10.6
buildings 10.4
university 10.3
museum 10.2
new 9.7
depository 9.6
door 9.6
roof 9.5
construction 9.4
historic 9.2
indoors 8.8
wall 8.7
shop 8.7
ancient 8.6
transportation 8.1
tower 8
hall 8
light 8
design 7.9
entrance 7.7
column 7.7
room 7.7
estate 7.6
commercial 7.5
wood 7.5
monument 7.5
floor 7.4
chair 7.4
man 7.4
transport 7.3
history 7.1
steel 7.1
temple 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 96.4
person 89.8
outdoor 89.2
clothing 86.9
man 73.4
group 59.5
store 32.5
line 22.2

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 89.3%
Confused 5.1%
Happy 1.9%
Disgusted 1.8%
Surprised 0.7%
Sad 0.5%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 21-29
Gender Male, 93.1%
Calm 87.2%
Confused 7.5%
Sad 2.3%
Happy 1.3%
Disgusted 0.5%
Angry 0.4%
Fear 0.4%
Surprised 0.4%

AWS Rekognition

Age 20-28
Gender Male, 83.2%
Happy 70.6%
Calm 9.6%
Sad 8.3%
Angry 6.1%
Surprised 2.9%
Confused 0.9%
Disgusted 0.9%
Fear 0.9%

AWS Rekognition

Age 24-34
Gender Male, 99.7%
Calm 92.8%
Happy 2.8%
Disgusted 2.3%
Surprised 0.8%
Confused 0.6%
Angry 0.3%
Sad 0.2%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Female, 58.6%
Calm 84%
Confused 4.5%
Sad 2.5%
Disgusted 2.3%
Happy 2.3%
Fear 1.9%
Angry 1.3%
Surprised 1.1%

AWS Rekognition

Age 43-51
Gender Male, 100%
Sad 33.7%
Confused 27.3%
Calm 18.2%
Happy 9.9%
Disgusted 4.6%
Angry 2.7%
Surprised 1.9%
Fear 1.7%

AWS Rekognition

Age 50-58
Gender Male, 66.8%
Happy 45%
Calm 33.6%
Surprised 8%
Sad 6.6%
Disgusted 3.2%
Angry 1.8%
Confused 1.4%
Fear 0.5%

AWS Rekognition

Age 25-35
Gender Male, 96.1%
Calm 96.2%
Confused 1.6%
Sad 1.2%
Disgusted 0.3%
Fear 0.3%
Angry 0.2%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 29-39
Gender Male, 99.4%
Calm 99.8%
Sad 0.1%
Angry 0.1%
Happy 0%
Disgusted 0%
Confused 0%
Surprised 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Categories

Text analysis

Amazon

DRIVE
NATIONAL
CM
HYDRA-MATIC
DEALER
KODIA

Google

HYORA
ATIAL
M HYORA MATIC ORIVE .... ATIAL
M
MATIC
ORIVE
....