Human Generated Data

Title

Untitled (four men standing by airplanes in field)

Date

c. 1950

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2667

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 99.6
Person 99.3
Person 98.8
Airplane 98.3
Aircraft 98.3
Vehicle 98.3
Transportation 98.3
Airplane 91.5
Airport 90.8
Airfield 88.7
Shorts 82.9
Clothing 82.9
Apparel 82.9
Building 75.5
Biplane 71.1
People 66.5
Landing 65.4
Field 61.7
Hangar 57.1

Clarifai
created on 2023-10-26

aircraft 99.9
airplane 99.9
people 99.6
airport 98.9
vehicle 98.8
transportation system 98.7
group together 98.5
biplane 98.1
adult 96.7
military 95
aviate 93.9
group 89.3
man 89.2
war 87.9
many 87.7
child 84.9
flight 84.2
engine 81
wear 78.6
two 76.7

Imagga
created on 2022-01-15

camper 21.5
helmet 19.2
device 17.5
recreational vehicle 17.2
football helmet 17.1
sky 15.3
travel 14.1
self-propelled vehicle 13.3
wheeled vehicle 12.9
protection 12.7
cockpit 12.1
vehicle 12.1
earth 11.9
machine 11.8
sea 11.7
equipment 11.5
headdress 11.3
technology 11.1
negative 11.1
transport 11
industrial 10.9
clothing 10.7
business 10.3
industry 10.2
power 10.1
film 9.7
military 9.7
respirator 9.4
structure 9.4
light 9.4
iron lung 9.3
danger 9.1
black 9
protective 8.8
building 8.7
aircraft 8.5
truck 8.4
action 8.3
moon 8.2
team 8.1
person 7.7
mask 7.7
old 7.7
clouds 7.6
dark 7.5
globe 7.4
man 7.4
training 7.4
safety 7.4
air 7.4
island 7.3
global 7.3
metal 7.2
road 7.2
world 7.2
transportation 7.2
architecture 7

Microsoft
created on 2022-01-15

clothing 95.6
person 95.4
text 94.5
black and white 78.1
footwear 67.6
man 64.6
sky 58.9
umbrella 56.1

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 87.5%
Confused 52.1%
Sad 29.5%
Happy 5.4%
Disgusted 5.2%
Calm 4.5%
Surprised 1.7%
Angry 1.1%
Fear 0.5%

AWS Rekognition

Age 26-36
Gender Male, 99.1%
Sad 71.1%
Confused 9.1%
Calm 7.4%
Happy 5.3%
Disgusted 4%
Angry 1.3%
Surprised 1%
Fear 0.8%

AWS Rekognition

Age 39-47
Gender Male, 98.4%
Happy 53.3%
Sad 25.3%
Calm 17%
Disgusted 1.5%
Confused 1.2%
Angry 0.8%
Fear 0.5%
Surprised 0.5%

AWS Rekognition

Age 37-45
Gender Male, 96%
Sad 90.6%
Disgusted 3.3%
Calm 2.7%
Surprised 1.1%
Confused 0.8%
Fear 0.8%
Happy 0.4%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Airplane 98.3%

Categories

Imagga

paintings art 99.1%

Text analysis

Google

KODVK- 2VEEIA
KODVK-
2VEEIA